Knowing exactly what is happening - the lost art of programming

The comfortable delusion of infinitely increasing computer speed is shattering - time to learn some real programming!


We are often faced with the idea that hand-writing assembler will give the fastest code. Next up the food chain is very low level C, and then C++, followed by, maybe, Java and tailed by something akin to Python. Does this all make sense? Is this the real situation, or is it a case of correlation not implying causation? In this post I would like to suggest the latter. I posit that knowledge of the underlying mechanisms employed by a program is a much bigger factor than the choice of language. If we were to read a file one line at a time from disk (a real spinning disk) then the performance of the system would be entirely dominated by the buffering strategy and not by the code reading the lines. Which language (Java, C and such) is used would make no noticeable difference at all. In such a case, hand-writing assembler would be a very bad design choice, as the chances of implementing a sub-optimal buffering strategy would be high; maintaining that choice correctly over many software and hardware updates would be near impossible.
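
To make that concrete, here is a minimal sketch (in C++; the input file name and buffer size are illustrative assumptions) of the same line-counting job done two ways: once with stdio's buffering switched off, so every character costs a library/OS round trip, and once pulling a megabyte at a time into our own buffer and scanning it in memory. On a spinning disk the second strategy utterly dominates the first, whatever language the loop is written in.

    #include <chrono>
    #include <cstddef>
    #include <cstdio>
    #include <iostream>
    #include <vector>

    // Count lines in 'big.txt' (a hypothetical large input file) two ways.
    int main() {
        const char* path = "big.txt";

        // Strategy 1: unbuffered - every fgetc() goes all the way down.
        std::FILE* f = std::fopen(path, "rb");
        if (!f) return 1;
        std::setvbuf(f, nullptr, _IONBF, 0);   // switch off stdio buffering
        auto t0 = std::chrono::steady_clock::now();
        long lines = 0;
        for (int c; (c = std::fgetc(f)) != EOF;)
            if (c == '\n') ++lines;
        auto t1 = std::chrono::steady_clock::now();
        std::fclose(f);
        std::cout << "unbuffered: " << lines << " lines, "
                  << std::chrono::duration<double>(t1 - t0).count() << " s\n";

        // Strategy 2: read 1 MiB at a time and scan it in memory.
        f = std::fopen(path, "rb");
        if (!f) return 1;
        std::vector<char> buf(1 << 20);
        t0 = std::chrono::steady_clock::now();
        lines = 0;
        for (std::size_t n; (n = std::fread(buf.data(), 1, buf.size(), f)) > 0;)
            for (std::size_t i = 0; i < n; ++i)
                if (buf[i] == '\n') ++lines;
        t1 = std::chrono::steady_clock::now();
        std::fclose(f);
        std::cout << "buffered:   " << lines << " lines, "
                  << std::chrono::duration<double>(t1 - t0).count() << " s\n";
    }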

So what am I saying? I guess I am saying that understanding the way the computer's I/O system actually works is more important than the code we write. Understanding the underlying 'exactly what is happening' is more important than the details of the implementation language. The difficulty we face as developers is that software practice has moved towards abstraction. Mathematics gets in the way of computer programmers in so many ways. Some are foolish enough to see computer programming as a branch of mathematics. This makes about as much sense as saying architecture is a branch of physics. Similarly, educators, scared of frightening young people off programming, hide the electronic nature of a computing machine behind interpreted, easy to understand programming techniques. This may well be a valid approach, except that the step to developer maturity - knowing what the computer is actually doing - never happens.

Sadly, the notion that simply writing code in a low level language like C will make it run fast is also misguided. Indeed, writing 'next to the machine' in assembler or C is no guarantee of performance. Humans lack the ability to process the large amounts of related data required to perform some of the best optimisations, at which modern compilers are getting so proficient. We might be brilliant at writing 20 instructions tailored for absolute maximum performance and never notice that those instructions can be elided in the majority of code paths thanks to branch speculation optimisation.
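
As a hedged illustration of what the compiler can see that we cannot easily hold in our heads: the innocent loop below gets turned into SIMD code by any modern optimiser (try g++ -O3 on it and read the assembly, or paste it into Compiler Explorer; the exact flags are toolchain-dependent assumptions). A hand-written scalar version would simply be slower, and would stay slower across every new CPU generation.

    #include <cstddef>

    // The optimiser has the whole loop in view: it can unroll, vectorise
    // and schedule it for the actual target CPU. (Because floating point
    // addition is not associative, vectorising this reduction needs
    // -ffast-math or similar; an integer version vectorises at plain -O3.)
    float dot(const float* a, const float* b, std::size_t n) {
        float acc = 0.0f;
        for (std::size_t i = 0; i < n; ++i)
            acc += a[i] * b[i];
        return acc;
    }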

Nevertheless, we cannot rely on compilers to always arrive at an optimal solution, and we definitely cannot expect them to work around poorly written code. What is required is cooperation between compiler and developer. To achieve that, we developers need to start learning more about the dirty mechanics of code execution; we need to stop thinking in highly abstract ways and start disassembling generated code. We need to look inside library objects like shared_ptr and find out the mechanical details of what they do, not just skim over the high level theory.
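
In that spirit, a minimal sketch of the kind of digging I mean, using std::shared_ptr: on a typical threaded build, every copy is an atomic increment on a shared control block (disassemble it and you will find the lock-prefixed instruction), while a move is just a pointer swap. On a hot path that difference is real, and you will not learn it from the high level description.

    #include <iostream>
    #include <memory>

    struct Widget { int x = 0; };

    int main() {
        auto p = std::make_shared<Widget>();   // object + control block
        std::cout << p.use_count() << '\n';    // 1

        auto q = p;             // copy: atomic increment on the control block
        std::cout << p.use_count() << '\n';    // 2

        auto r = std::move(q);  // move: pointer swap, no atomic traffic
        std::cout << p.use_count() << '\n';    // still 2
    }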

Similarly, there is nothing inherently bad about using interpreters; as a high-level abstraction for pushing large computational tasks around they can be very useful. The moment we find ourselves performing that computation in an interpreted (or, in truth, dynamic) language is the moment we have gone wrong.

Where does this leave abstraction? Do we have to jettison Haskell, Java and C++ in favour of C and Fortran? No, we can still work with abstraction where it helps - but we must not hide behind it.

  1. We need to accept that hot code needs to be fully characterised on the target hardware. Mathematical abstraction will not help.
  2. The free lunch of ever cheaper and faster hardware is over - single-core performance has stagnated and hardware costs are rising.
  3. Tooling must evolve beyond a simple 'this bit of code is taking the most time' - we need to know WHY (see the sketch after this list).
  4. We as developers need to stop hiding from reality - computers are machines, and computation is engineering, not mathematics or science.
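
On points 1 and 3, a hedged sketch of the sort of measurement I mean (the array size and strides are illustrative): the two passes below execute essentially the same instructions and touch every element exactly once, yet on most machines the strided pass is several times slower. A flat profiler would only tell you where the time went; measuring on the target, with the cache in mind, tells you WHY.

    #include <chrono>
    #include <cstddef>
    #include <iostream>
    #include <vector>

    // Touch every element of a big array, once sequentially and once with a
    // 64-byte stride. Same arithmetic; very different cache behaviour.
    static long long sum_with_stride(const std::vector<int>& v,
                                     std::size_t stride) {
        long long s = 0;
        for (std::size_t start = 0; start < stride; ++start)
            for (std::size_t i = start; i < v.size(); i += stride)
                s += v[i];
        return s;
    }

    int main() {
        std::vector<int> v(std::size_t(1) << 24, 1);   // 64 MiB of ints
        for (std::size_t stride : {std::size_t(1), std::size_t(16)}) {
            auto t0 = std::chrono::steady_clock::now();
            long long s = sum_with_stride(v, stride);
            auto t1 = std::chrono::steady_clock::now();
            std::cout << "stride " << stride << ": sum " << s << " in "
                      << std::chrono::duration<double>(t1 - t0).count()
                      << " s\n";
        }
    }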

1, 2 and 4 are starting to sink in, in the commercial world at least. I know of a few places where 3 is in its infancy. The future will be a more hardware-centric, intellectually challenging place: in my view, all the better for it.

 

VLog: Cathode Ray - why do supermarkets sell crap wine

I am starting a new project - a simple rant vlog

Episode 1



Enjoy

Is reverberation part of music?

What is reverberation? Is it an effect or is it part of music?

 
Sounds with extreme reverb'

Reverberation (or reverb') is considered an effect. We get guitar pedals for it and we twiddle dials to introduce it. Indeed, that modern trend of perfecting the life out of music has introduced rules for reverb' so that it all sounds equally bad on our iPods during the morning commute. Before we start to think differently about reverberation, a good first step is to understand where it comes from and why it matters.

Only a tiny part of hearing happens in your ears. Mistaking your ears for the things which hear is a bit like mistaking a microphone for an iPhone; only much worse! The processing of sound is done in real time in your brain. Interestingly, the processing is done in ways which resemble realtime processing in modern computer systems; I guess we developers cannot help but mimic the beauty of nature. When I say 'real time' what I mean is: within a fairly fixed amount of time after the signals reach our brains. If there is more auditory information coming in than our brains can process within the time limits set by our neural systems, that information is simply dumped.

One outcome of information dumping is that if one stuffs enough effects, drums and, yes, reverb' onto a piece by a crap singer (yes Bieber - I am looking at you) then the brain starts to ignore the crapness. Listen to the same person sing raw (even with autotune) and the real quality (or lack of it) will be laid bare. Conversely, natural sounds carry enough extra information with them to create artistic beauty.

Natural reverberation is the result of sound bouncing around our environment. Sound spreads through the air in three dimensions like an expanding sphere. When it impinges on an object it is reflected, absorbed, retransmitted and diffracted. It is also worth noting that air is a non-linear transmitting medium, and different frequencies of sound travel through it at very slightly different rates. If we go back to the bouncing idea and consider a hand clap, we can imagine that initially we will hear the clap. The sound of that clap will impinge on our ears after a short delay, as the sound has to travel to them. But what then?

We will not just hear the hand clap. Even in a perfectly damped studio we will hear an echo of the clap as it bounces off our shoulders. If we are wearing heavy clothes then the shoulder reflection will be damped, especially at the higher frequencies. If you hold your hands up high to clap, the reflection will be louder but more delayed than if you hold them down low.

Now we are getting somewhere! The threads of my seemingly random ranting are coming together. Our brains can take the frequency information, delay and amplitude of the reflection from our shoulders and use it to help place a sound source in our environment. However, real environments are more complex than that; they have walls, floors, trees, chairs, other people and other instruments in them. Our ears will pick up reflections from all of these. Information from all these sources goes together to help give us a perception of the space in which a sound is situated.

Let us go back to that hand clap. If we clap in an empty rectangular room, what will happen? A simple (and wrong, but often used) model is that the sound will bounce off the walls, floor and ceiling in a direct resonant fashion. We will thus hear 6 echoes. But then the sound which bounced off the floor will hit the ceiling and come back at us. This happens to all the sound; it bounces back and forth, 'reverberating' in the space. Each time it passes our ears we hear it and perceive reverb'.

If we make a reverberation system in software or hardware based on this model it sounds very artificial. The sound is often called 'metallic' and for longer reverberation times it starts to ring like a bell (a multiple frequency resonator with inharmonic frequencies always sounds metallic). Real rooms don't tend to ring like a bell; what is going on?
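
For the curious, that naive model is, in signal-processing terms, a single feedback delay line - a comb filter. A minimal sketch (the delay and gain values are illustrative assumptions): it resonates at every multiple of sampleRate/delay, and stacking a few of these with unrelated delays gives exactly the inharmonic, bell-like ring just described.

    #include <cstddef>
    #include <vector>

    // y[n] = x[n] + g * y[n - D]: every D samples the sound 'comes back'.
    std::vector<float> comb_reverb(const std::vector<float>& in,
                                   std::size_t delay, float gain) {
        std::vector<float> out(in.size(), 0.0f);
        for (std::size_t n = 0; n < in.size(); ++n) {
            out[n] = in[n];
            if (n >= delay)
                out[n] += gain * out[n - delay];   // the reflected copy
        }
        return out;
    }

    // e.g. comb_reverb(clap, 2205, 0.7f): a 50 ms round trip at 44.1 kHz,
    // losing 30% of its amplitude per bounce - and ringing metallically.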

Let us forget about sound for a while and consider a bouncy ball in our room. When I was a kid there were these see-through balls one could buy which seemed to bounce forever. What happens if we throw one of these at the floor as hard as we can? Well, it will bounce. Indeed, it will bounce and hit the ceiling. Then it will bounce back at the floor. Could you throw the ball so that it bounces back and forth on the same exact spot indefinitely? I think not. In this situation the ball will not only bounce up and down but will move around the room. Even if it were thrown perfectly vertically, slight inconsistencies in the floor, the ceiling and even the air through which it passes will knock it off course. Eventually our ball will get to the edge of the room and impinge on a wall. At that point we will see its so far structured bouncing devolve into a near random bouncing pattern. Given enough bounces, our ball will eventually pass through every point in space contained within the room. That is a very long way from the simple 'bounce back and forth' model of an over-simplified reverberation system.

"But sound is made of waves - you know - sound waves. What is all this bouncing ball nonsense?" Well we can think of sound as waves and as particles. Just as photons are particles of light, phonons are particles of sound. Given this insight from physics I will continue with my particle based explanation of reverb' and expect no more interruptions from the back of the room.

OK, so we have heard how a single ball/phonon will bounce around a room. But that does not explain what we hear. There are two more steps. First we need to understand when we will hear our single ball. We 'hear' it when it comes back and hits us. It then changes in some way. Our ball model fails at this point, because balls slow down when they hit things but phonons change pitch and/or dissipate. So, we can now consider that our phonon bounces off the wall and might come straight back at us, in which case we hear an 'early reflection'. It might have to bounce off several walls before it gets back to us, in which case we will hear a late reflection. The chance that it will dissipate (get converted to heat in a wall, or fly out of the room altogether) gets higher the longer it bounces around. So, the earlier it gets back to us, the more likely it is to make it. The higher the pitch, the more likely it is to dissipate at each bounce. This is why reverberation dies away, and why low frequencies die away more slowly.

Now we can consider a huge bunch of phonons; when we clap our hands they will all fly away from the sound source, forming an expanding sphere. Some will hit us, and we will hear those as the original sound. Then some more will bounce off a wall, ceiling or floor and come right back and hit us. Those will be the early reflections. Over time, others will bounce around a bit and eventually come back and hit us. We will get the original sound and the early reflections as distinct events, whilst after a few seconds balls will start hitting us more or less at random. Does that sound familiar? It should; that is what reverberation sounds like!
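
If you would rather see this ensemble behaviour than take my word for it, here is a toy Monte Carlo sketch of the story above (every constant is an illustrative assumption, not measured room acoustics): launch a burst of 'phonons', give each a random hop time between surfaces, a chance of being absorbed at each bounce, and a small chance of hitting the listener. Binning the arrival times produces the familiar shape: dense early arrivals decaying away into a sparse, random late tail. Raise the absorption, as happens at higher frequencies, and the tail dies faster.

    #include <cstddef>
    #include <cstdio>
    #include <random>
    #include <vector>

    int main() {
        std::mt19937 rng(42);
        std::uniform_real_distribution<double> unit(0.0, 1.0);
        std::exponential_distribution<double> hop(1.0 / 0.01); // ~10 ms hops

        const double absorb = 0.05; // chance a bounce dissipates the phonon
        const double hit    = 0.02; // chance a bounce sends it into our ears
        std::vector<int> bins(20, 0);             // 20 bins of 100 ms each

        for (int p = 0; p < 200000; ++p) {        // one loud hand clap
            double t = 0.0;
            for (;;) {
                t += hop(rng);                    // fly to the next surface
                if (unit(rng) < absorb) break;    // lost as heat in a wall
                if (unit(rng) < hit) {            // it reached the listener
                    std::size_t b = std::size_t(t / 0.1);
                    if (b < bins.size()) ++bins[b];
                    break;
                }
            }
        }
        for (std::size_t b = 0; b < bins.size(); ++b)
            std::printf("%3.1f s: %6d arrivals\n", b * 0.1, bins[b]);
    }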

Phonons will interact with one another as well as with walls and such. They constructively and destructively interfere with one another. This can eventually cause them to resonate with the space in a room. This is why sounds in a small hard-walled room (like a bathroom) can take on a metallic edge. However, this resonance effect is not normally dominant in real-life reverberation. Indeed, we tend to avoid recording sounds in spaces which resonate that way; thus modelling reverberation on resonance is not going to create a natural sound.

Our brains are able to process a lot of this information and convert it into a sense of the space in which the clap was created. We hear reverberation as an auditory image of the environment around a sound. The rate at which high frequencies die away compared to lower ones tells us something about the hardness of the surfaces. The timing of the early reflections tells us how far away the walls, floor and ceiling are. The difference in time between the original sound and the first reflection (often off the floor) will tell us how far away the source of the sound is. We can see with sound!
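
The arithmetic behind that last claim is pleasingly simple; a quick sketch, assuming roughly 343 m/s for sound in air: the gap between the direct sound and its first reflection translates directly into extra path length, which is what the brain appears to exploit.

    #include <cstdio>

    int main() {
        const double c = 343.0;                   // speed of sound in air, m/s
        for (double gap_ms : {1.0, 5.0, 10.0, 30.0})
            std::printf("gap %5.1f ms -> reflected path %5.2f m longer\n",
                        gap_ms, c * gap_ms / 1000.0);
    }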

From a musical standpoint I believe reverberation is part of music. The room in which music is played becomes an instrument. Playing a majestic piece (I always think of the slow movement from Beethoven’s seventh) in a room with no reverberation would rob it of a lot of its power. Sticking a metallic sounding artificial reverberation onto otherwise dry (reverberation free) sounds is just as bad or worse.

So, now that we have some view of what reverberation is, I hope you will agree that it is part of music and not just an effect. Once it becomes an effect we are probably not trying hard enough. As a final thought: this modern trend of matching the parameters (pre-delay etc.) of a reverberation effect to the timing of a piece of music is legitimate in the same way as auto-tune is legitimate: i.e. not at all.