That he does indeed. I like Andy Martin's demos for Reverb, and they both make a point of selecting tunes that one might normally and optimally use the effects with, but Bill's demos often go just a bit further and combine one or more other pedals to get exotic effects. Andy's are more in the realm of "here's what it does", while Bill's are more "here's what it could do". Of course, Andy is demoing any and every brand of product that comes through Reverb, while Bill concentrates only on EHX's, which I suppose gives him more time to study and think about their possibilities. I have no idea what Bill's role in or during product development might be, given his close attachment to EHX. He might simply be someone on retainer that they hand a pedal to now and then to demo, or he might be in touch with the product-development people along the way, and able to get a jump-start on thinking about how it could be used.

Mark, I was just about to post this. Bill does a great job demoing those pedals.
When he says they "modeled" the shimmer from a compressor, a POG, and a Deluxe MM, does that mean it is like a Line 6 delay, instead of a real delay?

From my reading, it would seem that the settings for each of the modes are saved, such that when you go back to them they are as you left them, but there is no capability to save multiple presets within a mode (e.g., 3 different versions of settings for shimmer).
I have a Strymon Timeline on my board. It is what I would call a "fake" delay. My 1978 Deluxe Memory Man is what I would call a real delay. It seems relevant to me with EHX delays because they make the very desirable and sought-after analog DMMs. If the Grand Canyon is an analog DMM-style delay with digital controls, that is what I would want over a digital modeled copy of the DMM's sounds (the DMM as an example). The Timeline is nice, and more gig-friendly than the old 1978 DMM, but the DMM kills the Timeline in tone, sound, and musicality. It is so nice. The difference is most noticeable in the repeating, decaying delays: with the DMM they sound very musical, while on the Timeline they become sort of angry-robot-like as they fade out.

I'm not sure how you mean that, particularly when you say a "real" delay. Does "real" = something analog? Does it mean digital, but built from scratch? Is a Line 6 delay something lesser, in your view? You've got me confused, Richard. My own sense is that they borrowed routines from their model of the DMM, borrowed octave-shifting algorithms from their POG, and added compression algorithms. If you have algorithms that people already like, why not reuse them? On the other hand, the comp+DMM+POG could simply be a description of what it will sound like to the listener's ear (i.e., a description that will make sense to them), and not literally those algorithms. Always worth not taking ad copy too literally.
I might have to sweet-talk you into building me a delay someday. Or modding mine to have stereo output.

In his otherwise flawed book on guitar effect pedals, Dave Hunter includes interviews with a number of notable pedal designers (Cornish, Vex, Fuller, et al.), one of them being Roger Mayer. One of the more interesting points Mayer makes concerns what he feels are audible differences between digital and analog delays. He felt their differences were primarily in the decay: analog sounds more "musical", he contended, because when one takes an analog snapshot of the signal, the resolution is no different for lower-amplitude signals; with digital sampling, by contrast, the number of bits left to encode lower-amplitude signals is smaller than for higher-amplitude ones, and the potential for quantization artifacts greater. So, I'll pull some numbers out of my hindquarters as an illustration. By his reasoning, a 16-bit A-to-D conversion would have 14 bits available to encode the peak (we'll save the remaining 2 bits for huge transient peaks), but only 8 bits to encode the 3rd decayed repeat, which would sound more "stair-steppy" and gritty as a result.
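Mayer's point about low-amplitude resolution is easy to demonstrate numerically. Here's a toy sketch of my own (an illustration only, nothing from Hunter's book or Mayer's actual designs): a quiet signal quantized at a fixed bit depth lands on only a handful of levels, which is exactly the "stair-steppy" grit being described.

```python
import math

def quantize(x, bits):
    """Quantize samples in [-1, 1] onto a signed linear PCM grid of 2**bits levels."""
    levels = 2 ** (bits - 1)
    return [round(v * levels) / levels for v in x]

# A quiet sine roughly 40 dB below full scale (amplitude 0.01),
# standing in for a well-decayed delay repeat.
quiet = [0.01 * math.sin(2 * math.pi * k / 32) for k in range(32)]

# At 16 bits the waveform's shape survives; at 8 bits the same signal
# collapses onto just a few quantization steps.
print(len(set(quantize(quiet, 16))))  # many distinct levels
print(len(set(quantize(quiet, 8))))   # only a handful
```

The function and signal names are mine; the only claim is the arithmetic: a quiet signal spans far fewer quantization steps at a low bit depth, so its reconstruction is coarser.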
That's an interesting argument, and it makes some degree of sense...in 2003, when the interview was conducted. Does it hold any water now, however, given how the technology has changed in the last 15 years? At present, most commercial effects are well beyond the resolution that was deemed sufficient for CDs (16-bit resolution, 44.1kHz sample rate). What may well hold true at 16-bit/48kHz may be moot at 24-bit/96kHz. Even if we save the 5 most significant bits for those pesky peaks, we've still got 19 bits left, which is substantially greater resolution than CD quality, and probably leaves 14 bits for encoding the decay.
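To put rough numbers on that trade-off (my own back-of-envelope arithmetic, not from any datasheet): each bit of linear PCM buys roughly 6.02 dB of dynamic range, so a repeat that has decayed N dB below full scale is encoded with about N/6.02 fewer effective bits.

```python
def effective_bits(total_bits, db_below_full_scale):
    """Rough effective resolution for a signal sitting the given number
    of dB under the converter's full-scale level (~6.02 dB per bit)."""
    return total_bits - db_below_full_scale / 6.02

# Three halvings of amplitude (say, a 3rd repeat at ~50% feedback)
# is about 18 dB down from the first.
db_down = 3 * 6.02

print(round(effective_bits(16, db_down), 1))  # 16-bit converter: 13.0 bits
print(round(effective_bits(24, db_down), 1))  # 24-bit converter: 21.0 bits
```

On these (made-up but representative) numbers, the decayed repeat still has more resolution in a 24-bit system than the undecayed signal had in a 16-bit one, which is the gist of the "the technology has moved on" point above.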
The other aspect that has changed is the algorithms applied. We know more about how delay signals change with recirculation and filtering in analog delays, and how tape saturation works in tape-based systems, so we can expect more realistic digital modelling of traditional analog "real" delay systems. So I think your skepticism, and the accompanying argument, IS valid for a given historical period, but is increasingly less valid as the technology changes.

I'll close by noting that I am generally dissatisfied with the decays in analog delays as well. I always mod mine to shave off a little more top end with successive repeats, so that echoes get duller as they go along. To my ears, this sounds more natural. I guess that means that, for some of us, "real" delays could stand some improvement as well.
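For what it's worth, that mod translates almost directly into DSP. A minimal sketch, assuming nothing about anyone's actual firmware (the function and parameter names are mine): a delay line with a one-pole lowpass inside the feedback path, so each recirculation shaves off a little more top end and the repeats get duller as they go.

```python
def delay_with_darkening(x, delay_samples, feedback=0.5, damping=0.4):
    """Feedback delay with progressive treble loss.

    damping in [0, 1): higher values roll off more top end on each
    pass through the feedback loop; 0.0 disables the filter entirely.
    """
    buf = [0.0] * delay_samples  # circular delay buffer
    lp = 0.0                     # one-pole lowpass state
    out = []
    idx = 0
    for s in x:
        delayed = buf[idx]
        # Lowpass the delayed signal before it re-enters the loop,
        # so every repeat passes through the filter one more time.
        lp = (1.0 - damping) * delayed + damping * lp
        buf[idx] = s + feedback * lp
        out.append(s + delayed)   # dry + wet
        idx = (idx + 1) % delay_samples
    return out
```

Feeding it an impulse with damping=0.0 gives plain halving repeats (1.0, 0.5, 0.25, ...); with damping > 0 each repeat comes back both quieter and smoother, which is the "more natural" decay described above.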
I sold off a couple of 1980s and '90s digital delays recently. I did not raise that concern with the buyers.