Wednesday, January 8, 2020

Schismatic Tuning

I am fascinated by musical tuning, from the conventional 12 equal steps per octave to all sorts of wild possibilities. Sometimes when I am out playing on the fringes, I learn something that can bring me back into much more conventional possibilities. So here is a way to tune a conventional keyboard. I'll call it "schismatic tuning" but I don't doubt that it has been explored again and again in the past.

Musical tuning is essentially a branch of mathematics, especially the way I approach it. Often in mathematics and science the focus is on novelty, on fresh discoveries, where fresh means not previously encountered by the human mind. This focus on novelty is unnecessary in math and science, somewhat distracting or misleading, and will likely serve us much less well in the future. Since the time of Kepler and Galileo, math and science have expanded in a stunning fashion. Predicting the future is a fool's game, but it seems unlikely that environmental constraints will permit continuous growth in extracting resources and dumping wastes. The modern trend of constant growth seems destined to end sooner rather than later. Math and science will be of great value in any post-growth society. To keep them alive, though, the focus will need to shift away from novelty.

So here is a historical preface to the schismatic tuning I have (re)discovered: Emilio de’ Cavalieri’s mysterious enharmonic passage - a modern rendition of a Renaissance recovery of an ancient Greek tuning! Paul Erlich has written a thorough discussion of tuning, A Middle Path Between Just Intonation and the Equal Temperaments - I have barely scratched the surface of this paper! I imagine that schismatic tuning is described in there somewhere! I would just like to share here my (re)discovery of this one small facet of the vast universe of tuning. I offer it as an invitation to explore further!

A quick review of fundamentals. A musical interval is the relationship between two pitches, which can be analyzed as the ratio between their frequencies. If pitches are an octave apart, their frequencies are in a 2:1 ratio; a fifth apart, a 3:2 ratio; a major third apart, a 5:4 ratio. These ratios are ideal. Just Intonation is a tuning that uses these ideal ratios. But for a variety of practical reasons, it is often useful to adjust, or temper, these ratios. There is no perfect solution to the puzzle of temperament. Modern keyboard tuning adjusts the fifth to 2^(7/12) ~= 1.4983 and the major third to 2^(4/12) ~= 1.26. The human ear can detect reasonably well the difference between this tempered major third and the ideal of 1.25.

Schismatic tuning is actually a family of tunings. I will present one version, based on dividing an octave into 53 equal steps, rather than the conventional 12. With 53 steps available, a fifth is tempered to ~1.499941 (31 microsteps) and a major third to ~1.248984 (17 microsteps). The fifth is improved, but the conventional tuning was already very good; the main improvement is in the major third.
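These microstep counts and temperings can be double-checked with a few lines of Python - just a sketch restating the arithmetic above, nothing beyond logarithms:

```python
import math

def best_step(ratio, divisions=53):
    """Nearest step of an equal division of the octave to a just ratio."""
    return round(divisions * math.log2(ratio))

for name, ratio in [("fifth", 3/2), ("major third", 5/4)]:
    steps = best_step(ratio)
    tempered = 2 ** (steps / 53)
    error = 1200 * (steps / 53 - math.log2(ratio))   # in cents
    print(f"{name}: {steps} microsteps, ~{tempered:.6f}, error {error:+.3f} cents")
```

The fifth comes out to 31 microsteps and the major third to 17, with the third's error under 2 cents, compared to roughly 14 cents for the conventional 12-step major third.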

How can such an improved tuning be adapted to a conventional keyboard? Here is my proposed schismatic tuning:

The top row names the keys on the keyboard. The second row gives the number of microsteps from the low C to the particular key of that column. The bottom row re-expresses that pitch in terms of cents. The conventional tuning would result in pitches of 0, 100, 200, 300, etc. cents. So this last row makes clear the difference in pitch between the schismatic tuning and conventional tuning, e.g. D is 3.774 cents sharper.

Some points to observe:

  • Almost all of the fifths are 31 microsteps, i.e. very accurate. From D to A is only 30 microsteps, though.
  • Four of the major thirds are the ideal 17 microsteps: C to E, F to A, G to B, and D to F#. The others are sharp by a microstep, i.e. closer to a Pythagorean major third, 81:64.
  • The sizes of the chromatic intervals in this tuning are not all the same: 4, 5, 4, 4, 5, 4, 5, 4, 4, 5, 4, 5 microsteps, summing to 53.
  • The syntonic comma is not tempered. E.g. moving by fifths up from C to E, one must cross the "wolf" fifth from D to A. This is a distinctly unconventional tuning.
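The points above pin down the whole layout, so it can be rebuilt and verified with a small script. The specific microstep numbers below are my reconstruction, obtained by stacking 31-microstep fifths around the single 30-microstep D-A fifth, not values quoted from the table:

```python
# Microsteps above the low C for each key, out of 53 per octave,
# reconstructed by stacking 31-microstep fifths with the single
# 30-microstep (wolf) fifth from D to A.
pitches = {"C": 0, "C#": 4, "D": 9, "Eb": 13, "E": 17, "F": 22,
           "F#": 26, "G": 31, "G#": 35, "A": 39, "Bb": 44, "B": 48}

def interval(lo, hi):
    """Microsteps from key lo up to key hi, within one octave."""
    return (pitches[hi] - pitches[lo]) % 53

fifths = {lo: interval(lo, hi) for lo, hi in
          [("C", "G"), ("G", "D"), ("D", "A"), ("A", "E"), ("E", "B"),
           ("B", "F#"), ("F#", "C#"), ("C#", "G#"), ("Eb", "Bb"),
           ("Bb", "F"), ("F", "C")]}       # all 31 except D->A = 30

thirds = {lo: interval(lo, hi) for lo, hi in
          [("C", "E"), ("F", "A"), ("G", "B"), ("D", "F#")]}  # all 17

cents = {key: round(v * 1200 / 53, 3) for key, v in pitches.items()}
print(cents["D"])   # 203.774, i.e. 3.774 cents sharper than 12-equal
```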

One can certainly play in any key signature with this tuning - none of the intervals is too far off. But a piece of music will certainly sound different when the key signature is changed. This tuning does, though, allow a simple dynamic shift as outlined in my post Dynamically Tuned Piano. With perhaps a push of a foot pedal, A can be sharpened by a syntonic comma:

or a different pedal could instead flatten the D by a syntonic comma:

These shifts will move the wolf fifth up or down a fifth, and also rotate which major thirds are pythagorean, etc.
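That motion of the wolf can be checked numerically. Here is a sketch using a microstep layout I reconstructed by stacking 31-step fifths around the 30-step D-A fifth (my own reconstruction, not quoted values); sharpening A by one microstep stands in for the syntonic comma, since one microstep is 1200/53 ≈ 22.6 cents, close to the comma's ~21.5 cents:

```python
pitches = {"C": 0, "C#": 4, "D": 9, "Eb": 13, "E": 17, "F": 22,
           "F#": 26, "G": 31, "G#": 35, "A": 39, "Bb": 44, "B": 48}

def interval(tuning, lo, hi):
    """Microsteps from key lo up to key hi, within one octave."""
    return (tuning[hi] - tuning[lo]) % 53

shifted = dict(pitches, A=pitches["A"] + 1)   # pedal: sharpen A one microstep

print(interval(pitches, "D", "A"), interval(pitches, "A", "E"))    # 30 31
print(interval(shifted, "D", "A"), interval(shifted, "A", "E"))    # 31 30
print(interval(pitches, "F", "A"), interval(shifted, "F", "A"))    # 17 18
print(interval(pitches, "A", "C#"), interval(shifted, "A", "C#"))  # 18 17
```

The wolf moves from D-A up to A-E, F-A trades away its just major third, and A-C# picks one up.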

Tuesday, November 5, 2019

Rotta Model for Bicycle Tires

I recently discovered Andrew Dressel, who wrote a PhD dissertation, Measuring and Modeling the Mechanical Properties of Bicycle Tires. There's a lot in there, of course, but amongst it all I found a description of the Rotta model for tires. This is much the same as what I have been exploring. I polished up my math and wrote a bit of new software to generate some new tables of recommended inflation pressures based on Frank Berto's rule of 15% squish.

The Rotta model is quite simple:

The tire is flat where it contacts the ground, and has a circular cross section where it is not in contact. Each cross section of the tire is treated independently.
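Since each cross-section is an independent circle flattened against the ground, the geometry at any one cross-section reduces to a chord calculation. This is my own reading of that statement, not an equation from the dissertation, and it treats the cross-section as a full circle of half the tire width, ignoring the rim:

```python
import math

def flat_halfwidth(radius, squish):
    """Half-width of the flat chord formed when a circular cross-section
    of the given radius (mm) is pressed down by `squish` (mm)."""
    squish = max(0.0, min(squish, radius))
    return math.sqrt(radius ** 2 - (radius - squish) ** 2)

# E.g. a 50 mm tire (25 mm cross-section radius) squished 15% of its width:
print(2 * flat_halfwidth(25, 0.15 * 50))   # flat width at the center, in mm
```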

Here are tables from which inflation pressure can be computed. The rows correspond to tire widths, in mm. The columns correspond to the ratio of tire width to inner rim width. E.g. the column with 2 at the top is for a tire twice as wide as the inner rim, e.g. a 50 mm wide tire on a 25 mm wide rim.

The tables give the area in square inches of the contact patch. To compute the inflation pressure, divide the load on the wheel by the area of the contact patch. E.g. a 50 mm tire on a 25-559 ETRTO rim (25 mm inner width, 559 BSD) will have a 2.06 sq in contact patch when it is squished down by 15% of its width. For that contact patch to support a 100 pound load, the inflation pressure should be 100 / 2.06 ≈ 48.5 PSI.
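The last step is only a division, but it's easy to get the units backwards; a minimal sketch with the numbers from the example:

```python
def inflation_pressure_psi(load_lb, contact_area_sqin):
    """Pressure (PSI) needed for a contact patch of the given area
    (square inches) to carry the given load (pounds)."""
    return load_lb / contact_area_sqin

print(round(inflation_pressure_psi(100, 2.06), 1))   # 48.5
```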

For 622 BSD:

For 559 BSD:

For 406 BSD:

These tables are calculated purely from theory, i.e. no parameters are used to fit them to any experimental data. Do not follow them blindly! They're food for thought & perhaps provide a useful starting point for exploration.

And a 305 BSD table:

Friday, September 13, 2019

A Simple Fix

Chris Hedges recently wrote about some top CEOs who proclaim the positive value of capitalism. These terms and ideas get packaged and promoted so heavily that it becomes very difficult to see how they actually work. I'd like to take them apart a little bit and suggest a simple change that could have profound effects.

Our large-scale society functions by way of complex organizations. Our welfare, indeed our survival, depends on the smooth functioning of these organizations. These organizations can only maintain themselves by earning profits, i.e. by selling finished products for prices high enough to pay for raw materials, maintain factories, pay workers and managers, etc. Investors who put up the up-front costs of starting and expanding a business need to earn a reasonable return on their investment, or they'll find something better to do with their money.

The distortion at the foundation of many of our problems today is that investors hold the ultimate decision-making power of corporations. Corporations are created and structured by means of the law. The law gives investors this power. A change in the law would change the way corporations are organized.

The classic straw-man for corporate governance is government control of corporations. But there are many other alternatives. Much of our problem today is that corporate power is too centralized. Government control would increase centralization. What we need is to distribute power.

Corporate governance needs to be widened. The Board of Directors of a corporation, the embodiment of regular decision-making, should include representatives from the workers and the managers, from suppliers and customers, from the local community, from folks living downstream and downwind, as well as from investors. Investors should not, in most cases, have a majority vote. Investors deserve a fair return on their investments, but this should be balanced with the legitimate concerns of the other stakeholders of the business.

It would be a catastrophic error to blame all our ills on big businesses and therefore to work to destroy those businesses. The fabric of our society consists of the goods and services flowing through these businesses. A small adjustment to the steering mechanism would have a profound effect on the path of evolution of this fabric. This is a simple and practical step that could fix the worst excesses of our current system.

Friday, August 16, 2019

Beyond Wonk Perfection

Some years back, the Washington Post's Wonkblog published an example of how precincts might be partitioned into districts unfairly versus fairly.

I would like to propose an improved partition, better than the "perfect" example of the Wonkblog:

In the Wonkblog's perfect partition, each district is entirely red or entirely blue. This makes the representation very rigid. Shifts in voter sentiment will not be reflected in shifts in representation, until the shifts are massive. In the partition I propose, some districts have slim majorities and others have large majorities. This approach is less rigid.

Sunday, June 30, 2019

Partitioning the Vote

My earlier notion for preventing gerrymandering was not very effective. Looking at the data for the Michigan 2018 election, the Democrats had a majority of the vote but a minority of the districts. My proposed rule would have done nothing to prevent that.

Exactly what a good rule might be, I don't know. It's a difficult problem! But the basic criterion that seems logical is that the fraction of districts won by each party should be roughly proportional to the fraction of votes for that party. A general shift in voter preference should then be reflected in a proportional shift in the fraction of districts won. This can be achieved by drawing district boundaries so that the fraction of votes for each party varies across a reasonably wide spectrum from district to district.

The challenge with this approach is that the notion of "a general shift" is too vague to be of much use. Still, a simple model can get the idea across.

Here is an idealized situation, a state with ten districts. When the vote is split equally, the range of district results could be:

With a general drift to 60% of the vote for one of the parties, the number of districts won could change proportionally:

A further shift, to 70%, would continue to be reflected in the proportion of districts won:
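These three scenarios can be played out in a toy model, with district vote shares spread evenly around the statewide share; the specific offsets are my own illustration, not numbers from any real election:

```python
def districts_won(statewide_share):
    """Districts won by a party in a toy ten-district state where each
    district's vote share sits at a fixed offset from the statewide share."""
    offsets = [i / 10 - 0.45 for i in range(10)]   # -0.45, -0.35, ..., +0.45
    shares = [min(1.0, max(0.0, statewide_share + o)) for o in offsets]
    return sum(1 for s in shares if s > 0.5)

for v in (0.5, 0.6, 0.7):
    print(f"{v:.0%} of the vote -> {districts_won(v)} of 10 districts")
```

With district shares spread across a wide spectrum, each ten-point swing in the statewide vote moves one more district, keeping seats roughly proportional to votes.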

Thursday, June 27, 2019

Against Gerrymandering

The notion of a free market strikes me as oxymoronic. It's like a game without rules. It's nonsense. A game is defined by its rules. Similarly, markets are defined by rules. Of course, good games and good markets have rules that are fair. Finding ways to structure markets effectively is a worthy challenge.

The recent Supreme Court decision, to keep the federal judiciary out of the business of how to draw the boundaries of legislative districts, strikes me as a move in support of a free political market. Why try to figure out what a fair market might look like? Let the market decide what is fair! But the problem with anarchy is that it is impossible. We can decide on rules using methods developed over the centuries and codified by brilliant political thinkers like the framers of the United States Constitution, or we can revert to the cruder methods of tyrants.

Of course the framers didn't provide all the answers. We the people have the responsibility, working with the general framework set out in the constitution, to work out the rules and regulations, the laws and institutions, by which we may prosper fairly and equitably.

It's clear enough that legislative boundaries can be gerrymandered to amplify the dominance of whichever party. Even if some branch of the government took on the task of preventing such corruption, it's not so clear what kind of rule could work against it effectively. I would like to propose a rule here that could work.

The basic trick used in gerrymandering is to concentrate the voters of the minority party into a few districts. The voters of the majority party are spread across the remaining districts, with just enough in each to win a very large number of districts.

So here is an effective anti-gerrymandering rule: for each x, there cannot be more districts with more than (50+x)% minority party voters than there are districts with more than (50+x)% majority party voters.

In other words, minority party voters cannot be concentrated in districts more than majority party voters are.
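The rule is easy to state as a check over a set of district results. Here is a sketch, scanning integer percentage thresholds; the pack-and-crack example map is my own illustration:

```python
def satisfies_rule(minority_shares, majority_shares):
    """For every x, districts with more than (50+x)% minority-party voters
    must not outnumber those with more than (50+x)% majority-party voters."""
    for x in range(0, 51):
        threshold = (50 + x) / 100
        packed_minority = sum(1 for s in minority_shares if s > threshold)
        packed_majority = sum(1 for s in majority_shares if s > threshold)
        if packed_minority > packed_majority:
            return False
    return True

# A classic gerrymander: minority-party voters packed into two 90% districts,
# leaving the majority party comfortable 65% wins in the other eight.
minority = [0.90, 0.90] + [0.35] * 8
majority = [1 - s for s in minority]
print(satisfies_rule(minority, majority))   # False: minority is over-packed
```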

This is a simple rule that would prevent the worst abuses of gerrymandering.

Thursday, November 29, 2018

Thermodynamics

(Interactive simulation: a canvas animation with a log(Temperature) control and an Energy readout.)

This little project is based on Dan Schroeder's Ising Model animation.

The way this kind of simulation works is that cells in the array are considered one by one. At a particular cell, alternative configurations are examined and one configuration is selected to be newly assigned to the cell. Each possible configuration, or value of the order parameter, will have an energy which is determined by the configurations of the neighboring cells. The new configuration is picked at random, with a bias toward lower energy configurations. The lower the temperature, the stronger the bias.

One of the tricky parts about this kind of simulation: when the order parameter is continuous, it's hard to come up with a good set of alternative values. The full set of alternatives is huge, practically infinite. Some small sample needs to be used. The small sample needs to include some configurations whose energy is at least not too much worse than the existing configuration, or the simulation won't have a good alternative to assign to the cell. At high temperatures, suitable alternatives can be substantially worse. At low temperatures, the alternatives can only be slightly worse. Thus, the temperature determines what makes a good set of alternatives. Trying to figure out that relationship ahead of time is too complicated, so an adaptive approach is suitable.

What I do in this simulation is to consider alternatives one by one. The first alternative is chosen at random from the full set. Each subsequent alternative is chosen from those within some range of the currently assigned configuration. As each alternative is chosen, a decision is made whether to assign it to the cell in place of the current configuration. As soon as a new configuration is assigned, the simulation moves on to other cells. When an alternative is not assigned, another is chosen from a narrower range. The range from which alternatives are picked gets narrower and narrower with each iteration. Eventually the alternatives are so close to the current assignment that the odds of being assigned become 50/50. This narrowing-range iteration provides an adaptive way of picking a set of alternative configurations.
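This update loop can be sketched in a few lines of Python. The XY-style continuous order parameter (an angle) and the exact form of the acceptance bias are my assumptions, since the text doesn't pin down the model; the Glauber-style rule below does give 50/50 odds as the energy difference shrinks to zero, matching the stopping behavior described above:

```python
import math, random

def site_energy(theta, neighbors):
    # XY-style coupling: energy is lowest when aligned with the neighbors.
    return -sum(math.cos(theta - t) for t in neighbors)

def update_cell(theta, neighbors, temperature, max_tries=30):
    """Update one cell, drawing alternatives from a range that halves
    after each rejection."""
    e_current = site_energy(theta, neighbors)
    width = 2 * math.pi                 # first alternative: anywhere at all
    for _ in range(max_tries):
        candidate = (theta + random.uniform(-width / 2, width / 2)) % (2 * math.pi)
        z = (site_energy(candidate, neighbors) - e_current) / temperature
        # Bias toward lower energy, stronger at lower temperature;
        # acceptance tends to 1/2 as the energy difference vanishes.
        accept = 0.0 if z > 700 else 1 / (1 + math.exp(z))
        if random.random() < accept:
            return candidate            # assigned; move on to the next cell
        width /= 2                      # rejected; draw from a narrower range
    return theta                        # keep the current configuration
```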