Wednesday, January 26, 2011

Designing Complex Systems

What kind of world and nation do we want? What are the realistic possibilities from which we are constrained to choose? Already these are enormously difficult questions. Perhaps foreign relations are the trickiest questions. Our present way of living is deeply intertwined with people all around the world. One simple measure of this is that we import about two trillion dollars of goods, while our total GDP is about fifteen trillion dollars. The more our vital interests are involved with other nations, the more we will find ourselves driven to protect those interests. This activity powerfully shapes our government. On the other hand, it's not as if utter isolation is a real option.

A similar spectrum appears at the national level. Perhaps the Tea Party is a revival of the Confederate ideas that sparked the nineteenth century Civil War. Our public and private institutions are so international that the boundaries between the fifty states are almost invisible. To what extent could institutions be reconnected to smaller regions, to states or even to cities? It is not clear how this could be done or what the consequences would be. The economic principle of comparative advantage is one demonstration of the more general maxim, "United we stand, divided we fall." Yet surely that maxim must have its limits.

What we have now is a way of living that is dominated by global institutions, public and private. It surely makes good sense to avoid concentration of power in any institution, but the protection of our vital interests and the regulation of interactions among such huge players surely requires government at a similar scale. In a different world a small scale government might be effective. In today's world, the only possible government is a large and complex government. The challenge is to find a way to make such a government efficient and effective.

In this project, we don't have to start with a blank sheet of paper. There are many global scale institutions and other systems of great complexity. The pragmatic approach is to study what works elsewhere and what doesn't work.

There are two aspects to building complex systems. The system itself will have some structure, e.g. incorporating hierarchy and repetition. The other aspect is the process of design. Just as there is no single system structure that can serve all purposes, the design process needs to fit the specific problem at hand. Still, there are some generally useful design methods that can be combined and adapted to fit most any problem.

Complex system design generally takes place in a series of phases. Design can be thought of as a series of decisions. The idea behind the phases is to order the decisions as well as possible. The main principle is that a decision is best made when the consequences of the alternatives are as clearly known as possible. For a trivial example, the size and shape of a container is generally best decided after the sizes and shapes of the contents are known; otherwise the contents might not fit in the container, or the container could be unnecessarily large. The usual first phase of designing a complex system is determining what it will be required to do and the constraints put on it by external factors.
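
To make the ordering of decisions concrete, here is a toy sketch of my own (the particular decisions are made up; graphlib is just the Python standard library's topological sorter): each decision lists the decisions whose outcomes it depends on, and sorting the dependency graph gives an order in which every decision is made only after its prerequisites are settled.

from graphlib import TopologicalSorter

# Each design decision maps to the decisions whose outcomes it depends on.
# The container/contents example: the container is sized only after the
# sizes and shapes of the contents are known.
decisions = {
    "choose the contents": [],
    "determine sizes and shapes of contents": ["choose the contents"],
    "decide size and shape of container": ["determine sizes and shapes of contents"],
    "work out the packing arrangement": ["determine sizes and shapes of contents",
                                         "decide size and shape of container"],
}

# A topological order: every decision comes after the decisions it depends on,
# so its consequences are as clearly known as they can be when it is made.
for step in TopologicalSorter(decisions).static_order():
    print(step)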

Another methodological principle is to identify and resolve risks early. The failure of a crucial component can prevent the system from performing its intended function. The less experience designers have in working with such components, the higher the risk of a design error. In such cases it may be worthwhile to build prototypes as a way to gain experience before scaling up to the full system.

Complex systems are generally designed by large teams with diverse expertise. Transparency and accountability are necessary in the design process in order for this diversity to remain a strength. For the designed system to cohere, effective communication must be maintained between the various design teams. Hidden decisions become hidden problems that can trigger catastrophic failures.

Any complex system will involve risk and novelty: each unique arrangement of components creates fresh opportunities for unforeseen interactions. To work effectively, a complex system must incorporate ways for its design to be changed as problems are detected, or as system requirements shift. The processes of design, implementation, and deployment cannot be kept completely distinct.

Separation of concerns is a principle that can apply both to system structure and to the design process. If every component of the system is tightly coupled to every other component, the system will be extremely brittle. Most likely some interactions will not be handled correctly and the system won't even work from the start. In any case, any small change in requirements will lead to updates rippling through the whole system, at prohibitive expense. The idea behind separation of concerns is to reduce the degree to which system components constrain each other, to reduce the level of coupling between them. The detailed operation of each component can be largely independent of the details of other components, as long as a few basic requirements are met. Changes in the design details of one component can then have little or no impact on the requirements of other components. This lets design teams work more independently, with many fewer iterations of rippling changes; during implementation and deployment, problems can likewise be identified and fixed with minimal impact.
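
As a toy illustration (a sketch of my own, with hypothetical component names): two components interact only through a narrow interface, and as long as that small requirement is met, either side can change its internal details without rippling into the other.

from typing import Protocol

class HeatSource(Protocol):
    # The only requirement the drying chamber places on its heat source.
    def air_temperature_c(self) -> float: ...

class SolarCollector:
    # One implementation; its internal details are hidden behind the interface.
    def __init__(self, insolation_w_m2: float) -> None:
        self.insolation = insolation_w_m2
    def air_temperature_c(self) -> float:
        # A crude internal model; changing it does not disturb the chamber.
        return 20.0 + self.insolation / 25.0

class DryingChamber:
    # Coupled only to the HeatSource interface, not to any particular collector.
    def __init__(self, source: HeatSource) -> None:
        self.source = source
    def ready_to_dry(self) -> bool:
        return self.source.air_temperature_c() > 40.0

print(DryingChamber(SolarCollector(insolation_w_m2=800.0)).ready_to_dry())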

I started thinking about government as a complex system when I started hearing about the problems with earmarks in legislation. It seems many such clauses are introduced into bills at the last minute. Legislators vote on bills whose contents they cannot effectively track. This is very much like computer software when configuration management tools are not used. Last minute changes are notoriously buggy, but developers always want to get the latest fixes and features into the next product release. The software industry has worked out some effective methods to manage this problem. Perhaps the government can learn something!

Tuesday, January 11, 2011

Solar Fruit Dryer

The current process by which food is provided to most folks in the United States is one that uses large amounts of petroleum, from the farm through the distribution network to the home. As petroleum and other sources of energy become scarcer, we will need to find new ways - and return to old ways - that are less energy-intensive. These changes will be required through all the stages of growing, preserving, and distributing food.

Here is a simple design for a solar fruit dryer. The lower part is a trapezoidal solar collector. The top surface of the collector is clear glass. This covers a space for air to flow, in from the bottom, up through the collector as it is heated, then into the drying chamber above the collector. The bottom surface of this air space is a metal surface painted black, to absorb sunlight and turn it into heat. Below the metal surface is a layer of insulation, so the heat from the metal surface goes into the flowing air above it rather than the outside air below it.

The heated air rises and flows from the solar collector into the drying chamber. This air passes over several trays of drying fruit. The air is driest as it enters the chamber, so the first tray it encounters is the one with the driest fruit, i.e. the fruit closest to the end of the drying process. As the air rises, it passes over successively less dry fruit. Finally the air rises out through the top of the drying chamber and into the chimney.

The chimney is simply a vertical tube that helps the rising air accumulate lifting power, keeping a steady movement of air in at the bottom of the solar collector, through the dryer, and out at the top of the chimney.
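
To put a rough number on that lifting power, the standard stack-effect relation can be used (this estimate is my addition, and the height and temperatures are made-up figures): the driving pressure is about the outside air density times g times the height of the warm column times (1 - T_outside/T_inside), with temperatures in kelvin.

# Rough stack-effect estimate of the chimney draft (illustrative numbers only).
RHO_OUT = 1.2    # kg/m^3, density of outside air (assumed)
G = 9.81         # m/s^2, gravitational acceleration
HEIGHT = 2.0     # m, collector inlet to chimney top (assumed)
T_OUT = 293.0    # K, outside air, about 20 C (assumed)
T_IN = 323.0     # K, heated air inside, about 50 C (assumed)

delta_p = RHO_OUT * G * HEIGHT * (1.0 - T_OUT / T_IN)
print(f"driving pressure ~ {delta_p:.1f} Pa")   # a couple of pascals: gentle but steady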

The drying chamber has a door allowing new trays of fresh fruit to be added at the top and trays of dried fruit to be removed from the bottom. As trays are removed from the bottom, the remaining trays should be shuffled down, creating space at the top for the fresh trays to be added.

The next stage of development of this idea will be to tune the relative sizes of the components, to allow maximum throughput of fruit with the least expense. A moderately sized unit ought to be inexpensive enough to build that this tuning can effectively be done experimentally.

The grand vision is that the dried fruit can be taken by bicycle to local farmstands to be sold, and then fresh fruit to be dried can be brought from the farmstand on the return bike trip. Mix in canoes as needed!

Sunday, January 2, 2011

Twisted Arpeggios

I took some piano lessons in 7th grade and a few guitar lessons in 8th grade, but never attained any real competence in performance. I still have no understanding of music theory at all: counterpoint, harmony, etc. It was in sophomore year of college, when I learned about the overtone series in a physics class, that my fascination with music opened to a deeper dimension. I got out my old guitar - I think I spent a year just tuning that guitar. I also started exploring the mathematics of tuning.

In the spring of junior year I wrote a computer program to search for scales made of equal sized steps. The conventional musical scale has equal steps whose size is a twelfth of an octave. Perhaps some other step size could give a better scale. What would be a good measure for the quality of a scale, or of the step size that generates that scale? With the overtone series as the foundation of musical intervals, the answer must be that the scale contains pitches that are close approximations to the just-tuned intervals. In my program I just used the overtone series itself: the scale should contain pitches that closely approximate the pitches in the overtone series. Clearly the lower overtones are more important, so I formed a weighted sum of differences. Each difference was the gap between an overtone pitch and the closest pitch in the scale. The weights decreased as one ascended in the overtone series, in such a way that the sum would converge as one extended it to include all of the infinite pitches of the overtone series.
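
The details of that 1976 program are long gone, but here is a sketch of the kind of metric it computed (the 1/n² weighting is an assumption, chosen only so the infinite sum would converge; smaller totals mean better scales):

import math

def scale_quality(step: float, n_overtones: int = 64) -> float:
    # Weighted misfit between the overtone series and a scale of equal steps.
    # For each overtone n, find the nearest scale pitch step**k and measure the
    # gap in log-frequency; weight the gaps so the total converges as more
    # overtones are included.
    total = 0.0
    log_step = math.log(step)
    for n in range(2, n_overtones + 1):
        target = math.log(n)              # overtone n, in log-frequency
        k = round(target / log_step)      # nearest scale degree
        gap = abs(target - k * log_step)  # misfit for this overtone
        total += gap / n ** 2             # decreasing weights
    return total

print(scale_quality(2 ** (1 / 12)))       # the conventional 12-step scale
print(scale_quality(2 ** (1 / 53)))       # the 53-step scale discussed below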

Any fixed step size will generate a scale for which this quality metric can be computed. In my program, I computed this metric for a range of step sizes, e.g. 1.050, 1.051, 1.052, … 1.067, 1.068, 1.069, 1.070. I checked the quality metric for each step size against those of the neighboring step sizes. At the optimum step size, the quality metric will be better than those for both slightly smaller and slightly larger step sizes. I expected to see just one or two step sizes singled out as those with the best quality metric, i.e. those that would generate the best musical scales.
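
In modern form the scan itself looks something like this (again a sketch, reusing scale_quality from the sketch above):

# Scan a range of step sizes and flag the local optima: step sizes whose
# quality metric is better (smaller) than that of both neighbors.
step_sizes = [round(1.050 + 0.001 * i, 3) for i in range(21)]   # 1.050 ... 1.070
metrics = [scale_quality(s) for s in step_sizes]

local_optima = [step_sizes[i] for i in range(1, len(metrics) - 1)
                if metrics[i] < metrics[i - 1] and metrics[i] < metrics[i + 1]]
print(local_optima)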

This was 1976, so my program was punched into cards and read into the job queue of the campus mainframe. Half an hour or so later the resulting print-out was tossed into my slot. I was totally surprised - practically a third of the step sizes were the best in their little region. A graph of quality metric against step size would not show any kind of smooth approach to a few optima, but rather be very bumpy. I decreased the scanning increment, so now I would explore step sizes like 1.0500, 1.0501, 1.0502, etc. Another half hour wait in the ready room for my result print-out to get dropped in my slot, and an even bigger surprise - zooming in to finer detail revealed even more local optima! This was one very bumpy curve!

I had the idea of going over to the math department to see if someone could explain to me what was going on. Such a bumpy curve didn't fit the picture I had in my head from basic calculus! But the school year was over. The summer of 1976 took me out to Grand Junction, Colorado, where I was working on historical uranium data with a couple of geology professors, Ken Deffeyes and Ian MacGregor. For a little summer reading I brought along Functional Analysis by Riesz and Sz.-Nagy. Amazing! On page 3, there was my musical quality metric, or at least an infinite sum very much like it - given as a classical example of a function that is continuous everywhere but differentiable nowhere. A very bumpy curve!
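
For reference, the best-known example of such a function (whether or not it is the exact sum on that page) is Weierstrass's W(x) = sum over n of aⁿ cos(bⁿπx), with 0 < a < 1, b an odd integer, and ab > 1 + 3π/2: the series converges uniformly, so W is continuous everywhere, yet it has a derivative nowhere.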

It's nice to stumble onto classical results, but unfortunately that put an end to strategy number one for discovering new and improved musical scales. Basic calculus can't analyze such functions! Strategy number two then emerged, probably still in undergraduate years. Another way to understand how a good musical scale arises is by looking at commas, rational fractions very close to 1 that are constructed by multiplying and dividing small primes.

One fundamental comma, the syntonic comma 81/80, comes up in tuning a guitar. A guitar can be tuned using harmonics, plucking the strings while touching their nodal points which causes the overtones to sound clearly. Playing the fourth harmonic of the low E string along with the third harmonic of the next A string, one can adjust the string tension to bring these to the same pitch, so the E and A strings will be separated by a just tuned interval of a fourth, i.e. a frequency ratio of 4/3. Of the five intervals separating the six strings of a guitar, four are fourths. The fifth interval, from G to B, is a major third. Here the fourth and fifth harmonics can be used to bring those strings into the corresponding just tuned frequency ratio of 5/4.

This approach to tuning a guitar doesn't quite work, though. There should be two octaves from the low E string to the high E string. But when the five just tuned intervals are stacked, the four fourths and the one major third, the top of the stack doesn't end up quite two octaves above the bottom of the stack. Numerically, (4/3)(4/3)(4/3)(5/4)(4/3) = 320/81, where two octaves should be 324/81. This gap, 324/320 = 81/80, the syntonic comma, is one of the most basic differences between just tuning and equal temperament. Is just tuning right and equal temperament wrong, or what difference does this difference really make? That is a nice question to explore!
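
Both the stack of intervals and the comma can be checked with exact fractions; here is a quick sketch:

from fractions import Fraction

fourth = Fraction(4, 3)        # E-A, A-D, D-G, B-E when just tuned
major_third = Fraction(5, 4)   # G-B when just tuned

stack = fourth ** 4 * major_third    # low E up to high E across the five intervals
print(stack)                         # 320/81
print(Fraction(4, 1) / stack)        # 81/80, the syntonic comma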

Strategy number two involved looking for commas. I wrote a program to explore numbers whose factors were all small primes - as I recall, I limited the primes to 2, 3, 5, and 7. My program constructed these composite numbers in ascending order, and looked for pairs that were especially close. Perhaps those results from 30 years ago are still saved in some box of papers in my storage unit, but the details are long gone from my memory. The one result that I remember is that I became fascinated by a scale with equal steps that divided the interval from the fundamental to the tenth harmonic into 90 equal parts. Curiously, this scale would not have exact octaves. Still, it contained enough good approximations to important just intervals that it should support music.
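
That original program is lost, but its search can be reconstructed in a few lines (a sketch; the search bound and the closeness threshold are arbitrary choices): generate the numbers whose only prime factors are 2, 3, 5, and 7, in ascending order, and report neighboring pairs whose ratio is very close to 1.

# Find candidate commas: neighboring 7-smooth numbers whose ratio is close to 1.
LIMIT = 100_000        # arbitrary search bound
THRESHOLD = 1.0005     # how close to 1 a ratio must be to be reported

smooth = []
for n in range(1, LIMIT + 1):
    m = n
    for p in (2, 3, 5, 7):
        while m % p == 0:
            m //= p
    if m == 1:
        smooth.append(n)        # n has no prime factors other than 2, 3, 5, 7

for a, b in zip(smooth, smooth[1:]):
    if b / a < THRESHOLD:
        print(f"{b}/{a}")       # e.g. 2401/2400 and 4375/4374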

Inventing musical scales is not much fun until two further problems are solved: one needs a way to play notes from any proposed scale, and one needs a way to arrange those notes into musical patterns. So my next idea was a way to lay out keys on some kind of keyboard instrument to allow performance of music that uses small step sizes. If the keys of such an instrument are laid out in a line in ascending order, there can easily be so many steps in an octave that either the keys will be too narrow to play or the octave will be too wide to reach. So I came up with a two dimensional arrangement, with hexagonal keys covering a flat surface. The pitch of the keys increases to the north and decreases to the south. To the east and west are arrayed keys that have very similar pitch. The fundamental idea is that a very narrow interval is actually quite distant harmonically. The keyboard I proposed arranges the keys according to harmonic relationships. Keys very close in pitch can be quite far apart on the keyboard.

The next problem is then how to compose music in one of these proposed scales. The easiest approach to start with is just to transpose or translate existing music into the new scale. If the new scale is based on the same fundamental intervals, such as perfect fifths and major thirds, as the old music, then the transposition should be simple enough… or so I thought! I took a couple of Bach's Harmonized Chorales and attempted the transposition. For any given note in Bach's score, I needed to select which of several closely spaced microtonal notes should be played. With the harmonic relationships available on my keyboard design, the range of choices was essentially a spectrum of notes separated by syntonic commas. I decided simply to use the color spectrum to add this detail to Bach's score. All notes of a given color would be related by Pythagorean intervals, ratios built from the primes 2 and 3. Each factor of 5 in an interval would shift one color in the spectrum, from blue to green etc. The interval of a syntonic comma is denoted by a simple shift in color with no change of position of the note on the staff. The transposition of Bach's score was simply a matter of copying the score and adding the appropriate colors.
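
Stated as a little sketch (the particular color names and the choice of a base color are only for illustration), the rule is: write a just interval as a fraction, count the net factors of 5 in it, and shift that many colors along the spectrum.

from fractions import Fraction

def exponent_of_5(interval: Fraction) -> int:
    # Net number of factors of 5 (positive if in the numerator).
    e, n, d = 0, interval.numerator, interval.denominator
    while n % 5 == 0:
        n //= 5
        e += 1
    while d % 5 == 0:
        d //= 5
        e -= 1
    return e

COLORS = ["violet", "blue", "green", "yellow", "orange", "red"]
BASE = 1   # blue for Pythagorean intervals (no factors of 5) -- an arbitrary choice

for name, ratio in [("perfect fifth", Fraction(3, 2)),
                    ("major third", Fraction(5, 4)),
                    ("syntonic comma", Fraction(81, 80))]:
    print(name, COLORS[BASE + exponent_of_5(ratio)])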

I was amazed, not just because I found the task to be impossible, but also because the attempt seemed to reveal the structure of the music. Right up to near the end, I found it easy to see which colors to use for the notes. But right near the end, I found conflicting harmonic relationships. It wasn't so much that I couldn't decide which was the right color, but more that the notes seemed to demand two colors at once. This conflict or ambiguity seemed to create a harmonic tension that was fundamental to the structure of Bach's music. Sure enough, the final notes of the compositions were again easy to color - the tension was resolved. That was my music lesson for the year!

About a decade after this, in 1991, I had the good fortune to be able to spend some time working at MIT as a visiting scientist - which also got me a library card! I found in MIT's music library a copy of Easley Blackwood's The Structure of Recognizable Diatonic Tunings. I was amazed to see in this book an exhaustive demonstration of the lesson I had learned from Bach. Blackwood attempts to transpose into just tuning a whole series of classical compositions from Bach to Mahler. He didn't use colors but simply little "+" marks to pick notes out of that spectrum of alternatives separated by syntonic commas. Blackwood shows that this transposition simply doesn't work. Music in this classical Western tradition creates a sense of tension and movement using the ambiguity introduced by temperament. The ambiguity cannot be removed without destroying the music.

I figured the book must have been the PhD dissertation of some obscure graduate student who then disappeared back into the woodwork. Another surprise! In the late 1990s I was in a CD store, probably looking for Beethoven - but right alongside, a section for Easley Blackwood's music! He is a distinguished composer as well as a theorist. His book on tunings marked a shift in his career, closing a phase of experimentation with tunings and returning to music very much in the classical tradition.

There are surely many different approaches to the construction of musical scales, and no doubt many of them can provide foundations for music making. If one wants to play in the space created by the ambiguities of temperament, the possibility of alternative scales does not disappear. Thinking of ambiguity as a resource does, though, shift how one evaluates a scale.

My fascination for the last decade or so has been with the scale that divides an octave into 53 equal intervals. I've already outlined the logic behind 53 at http://www.interdependentscience.com/music/calliopist.html so I won't repeat that discussion here. This tuning gives very good approximations to just tuned thirds and fifths. But what if we want to play the ambiguity game? What new opportunities does this tuning open up?

The syntonic comma is the fundamental interval tempered by the conventional 12 step scale. Each step of the 53 step scale is just about the size of a syntonic comma, i.e. the 53 step scale accurately expresses exactly what the conventional 12 step scale blurs. The music of the 53 step tuning will be something quite different.
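
The sizes really are close; here is a quick check in cents (just arithmetic, nothing assumed beyond the definitions):

import math

def cents(ratio: float) -> float:
    return 1200 * math.log2(ratio)

print(f"one step of 53:   {cents(2 ** (1 / 53)):.2f} cents")   # about 22.64
print(f"syntonic comma:   {cents(81 / 80):.2f} cents")         # about 21.51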

What is a comma that the 53 step scale blurs? The kleisma, 15625:15552, is one fundamental such comma. A minor third is 14 steps in this tuning, while a perfect fifth is 31 steps. If six minor thirds are stacked, that makes 84 steps - exactly an octave above a perfect fifth. Curiously, this comma is not tempered in the conventional 12 step scale, where a minor third is 3 steps and a perfect fifth is 7 steps. The stack of six minor thirds is thus 18 conventional steps, one step short of an octave above a perfect fifth. This again shows that music in the 53 step tuning will be quite different from that of the conventional 12 step tuning.
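
All of these step counts, and the kleisma itself, are easy to verify (a short check using exact fractions for the just intervals):

from fractions import Fraction

minor_third = Fraction(6, 5)
octave_plus_fifth = Fraction(3, 1)               # 2 * 3/2

print(octave_plus_fifth / minor_third ** 6)      # 15625/15552, the kleisma

# In the 53 step scale six minor thirds land exactly an octave above a fifth...
print(6 * 14, 53 + 31)                           # 84 84
# ...but in the conventional 12 step scale they fall one step short.
print(6 * 3, 12 + 7)                             # 18 19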

So my project has become a kind of parallel construction: to build up compositional methods for the 53 step tuning that mimic those in the classical western tradition using the 12 step tuning. Another fundamental compositional structure in classical western music is the key signature. For the most part, a segment of music will not use all twelve notes of the full scale, but instead be centered on a subset of seven. There are different ways to think about the way these seven are selected: perhaps there is a starting note and then a stack of six perfect fifths. Or the seven can be seen as three major triads stacked. These two ways to see the structure of the key signature differ by just the syntonic comma that is blurred by the temperament. If we see the white notes of the piano as a stack of perfect fifths F-C-G-D-A-E-B, then the A is four perfect fifths from the F. Alternatively, the white notes could be the three major triads FAC-CEG-GBD, in which case the A is a major third from the F. The just tuned difference between a major third and a stack of four perfect fifths is just the syntonic comma.
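
Here is the same point as a small check, with every pitch written as a just ratio above F and folded into a single octave (my own illustration of the comparison above):

from fractions import Fraction

def octave_reduce(r: Fraction) -> Fraction:
    # Fold a frequency ratio into the octave [1, 2).
    while r >= 2:
        r /= 2
    while r < 1:
        r *= 2
    return r

fifth, major_third = Fraction(3, 2), Fraction(5, 4)

a_by_fifths = octave_reduce(fifth ** 4)   # A as four fifths above F (F-C-G-D-A)
a_by_third = major_third                  # A as a major third above F (the triad FAC)

print(a_by_fifths, a_by_third)            # 81/64 versus 5/4 (= 80/64)
print(a_by_fifths / a_by_third)           # 81/80, the syntonic comma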

Can I construct some subset of the 53 steps of the octave, with a structure based on the kleisma that parallels the role of the syntonic comma in the conventional 12 step scale? Here is one attempt. The syntonic comma is four perfect fifths next to a single major third. The kleisma is six minor thirds next to a perfect fourth. Why not construct a key signature based on minor thirds? Take a stack of twelve notes, with a minor third between each neighboring pair. Then each note in the bottom six can be paired with a note in the top six, with the interval of a tempered fourth between each pair. This way, the roles of fifths and thirds are just reversed relative to the conventional seven-out-of-twelve key signature. Can a person make music out of this structure?
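
Here is a sketch of that construction, counting steps in the 53 step octave (my own rendering of the description above):

# Twelve notes separated by tempered minor thirds (14 steps each) in 53-equal.
OCTAVE, MINOR_THIRD, FOURTH, FIFTH = 53, 14, 22, 31

stack = [i * MINOR_THIRD for i in range(12)]     # 0, 14, 28, ..., 154 steps

# Pair each of the bottom six notes with the note six places above it.
for low, high in zip(stack[:6], stack[6:]):
    gap = high - low                             # always 84 steps = six minor thirds
    print(gap, OCTAVE + FIFTH, 2 * OCTAVE - FOURTH)
    # 84 = an octave plus a fifth = two octaves less a fourth, so each pair is
    # separated, octave-equivalently, by a tempered fourth.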

As a first step, here is a sequence of arpeggios:

http://www.interdependentscience.com/music/arpeggios.mp3

This parallels a conventional sequence like CEG DFA EGB FAC GBD ACE BDF CEG. In the conventional sequence, almost every triad includes a perfect fifth. Only the triad based on B has a flat fifth, because B is at the end of the stack of fifths. In a twisted sort of parallel way, almost every triad in this new sequence includes a minor third. There is just one triad that has a flatter third. I confess, I am just feeling around in the dark here for what might work!

In the conventional sequence, roughly half the triads have a major third from the root while the other half have minor thirds. In a twisted parallel way, this new sequence has about half the triads with fifths from the root and half with fourths.

Does this subset support music? That's not an easy question! Even more importantly, does it support making music that has something to say that couldn't be said in the conventional twelve step scale? That's an even harder question! My main hope is that I can help open up some possibilities enough that others with more compositional talent can explore more deeply.

Here, though, is my first attempt to create some music using this structure:

http://www.soundclick.com/bands/page_songInfo.cfm?bandID=784761&songID=7607461

I should note here that very likely none of these ideas were first proposed by me. The scale with 53 steps per octave has a very long history - centuries and likely millennia. Decades ago Shohé Tanaka named the kleisma and noted its importance in the 53 step scale. Fewer decades ago Larry Hanson selected a subset of the 53 notes that was very similar to the set of twelve that I proposed. See:

http://anaphoria.com/hanson.pdf

I learned a lot of this history from the microtuning group on Yahoo:

http://launch.groups.yahoo.com/group/MakeMicroMusic/