
Re: HUM_FORUM: Comments on study abstract...

  • Copsne
    Sep 21, 2013
      What if the term doesn't matter? The fact is, something low frequency screwed this guy's thoughts up. Just like the hum in Lanza's neighborhood. Except the DC guy gave doctors the evidence. 

      Sent from Steve's iPhone and I apologize for typos

      On Sep 21, 2013, at 4:22 PM, Jim <w4jbm@...> wrote:


      > I evidently need a lesson in this, as I found this article below and it does mention “extremely low frequency microwaves”.
      > http://www.ncbi.nlm.nih.gov/pubmed/15586889

      It does get confusing because there are so many references that just assume you have a degree in physics, medicine, or engineering. Actually there are parts of this that don't seem to make sense on their own. I would need to look at the complete study to better understand what they are talking about. But, as you point out, the conclusion says:

      "The data obtained provide additional evidence that repeated low-level exposure to extremely low frequency microwaves can modify an activity of cholinergic system in the brain."

      So there is a pretty explicit reference as you mention. But looking back there are parts I can explain.

      "Averaged frequency spectra (0.5-30 Hz) of the electroencephalogram were studied in freely moving rats with carbon electrodes implanted into the somatosensory cortex."

      Basically what that means is that they were taking an EEG of the rats. I'm not an expert in that area, but an EEG measures the brain's electrical impulses, usually with probes on the scalp (here, with electrodes implanted directly into the cortex). This gives you a view of brain activity. They looked only at very low frequencies (the 0.5 to 30 Hz they mention means signals that repeat anywhere from once every two seconds up to thirty times per second). I think those are fairly normal frequencies to study brain activity while avoiding noise. (For example, you couldn't get a meaningful reading up at 50 Hz in Europe or 60 Hz in North America because those are the frequencies the power grids operate on. The sensors would pick up nothing but noise unless you had significant shielding. Also, my understanding is that most meaningful brain activity we can measure happens at those lower frequencies anyway.)
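As a quick reference (my addition, not from the abstract), the 0.5-30 Hz range they studied covers the conventionally named EEG bands. Band boundaries vary a bit between textbooks; these are common values:

```python
# Conventional EEG frequency bands covering the study's 0.5-30 Hz range.
# Boundaries are approximate and vary slightly between sources.
EEG_BANDS = [
    ("delta", 0.5, 4.0),
    ("theta", 4.0, 8.0),
    ("alpha", 8.0, 13.0),
    ("beta", 13.0, 30.0),
]

def band_name(freq_hz):
    """Return the conventional EEG band name for a frequency, or None."""
    for name, lo, hi in EEG_BANDS:
        if lo <= freq_hz < hi:
            return name
    return None

print(band_name(2))    # delta
print(band_name(20))   # beta -- the "fast rhythms" (18-30 Hz) in the abstract
print(band_name(60))   # None -- mains frequency, outside the measured range
```

Note that the "fast electroencephalographic rhythms (18-30 Hz)" the abstract talks about later fall in what is usually called the beta band.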

      "The rats were repeatedly (3 days, 30 min day(-1)) exposed to low-intensity (approximately = 0.3 mW cm(-2)) microwaves (915 MHz, 20-ms pulse duration), amplitude modulated (square-wave) at extremely low frequency (4 Hz)."

      Parts of this are clear, but other parts take some decoding. Starting at the first, they studied the rats over three days and zapped them with microwaves for 30 minutes per day. That is reasonably straightforward. It doesn't seem like much time. The (-1) is just an exponent that lost its superscript formatting: "min day(-1)" means min day^-1, in other words minutes per day.

      Let's talk about "low-intensity". The units look messed up for the same lost-superscript reason: "mW cm(-2)" is mW cm^-2, so they mean 0.3 milliwatts per cm^2 (square centimeter). That is the standard unit for measuring RF exposure of humans. Is that really "low-intensity"?

      There are standards for human exposure. A pretty decent (and understandable) write up is on the site of the national organization for radio amateurs (the American Radio Relay League or the ARRL for short).


      If you look at Table 1, you find that for frequencies from 300 to 1500 MHz, the exposure limit is f/300 (where f is the frequency in MHz) for controlled exposure or f/1500 for uncontrolled exposure. Controlled and uncontrolled exposure could use some elaboration. Basically, controlled means a person would be aware of the exposure and could minimize the time (to under six minutes), while uncontrolled means the person might be unaware they were being exposed (and the assumption is that the duration of exposure would be under 30 minutes).

      In the context of ham radio, that makes sense. If I'm adjusting an antenna to aim it at some distant site, I may stand near the antenna and manually adjust the mast. In doing that I could be a few feet from the antenna. Say it's on the roof of my house. Once it's adjusted I climb down my ladder and I'm further away. But say it points right towards some spot in my neighbor's house. They could "walk through" the radio frequency path and not even know it. So while the time adjusting would be "controlled" exposure, a neighbor might experience "uncontrolled" exposure. In reality, even the most long-winded ham probably isn't going to hold the key down and talk for ten minutes straight. Ham radio operators (and other operators of radio frequency systems) are responsible for knowing the limits and ensuring they don't cause exposure that exceeds them. And, generally, other than examples like adjusting the antenna, the power levels used at these frequencies tend to be low enough that uncontrolled exposure isn't a problem. (And I just use the ham radio example because I'm familiar with it.)

      Back to the study... The study was done at around 900 MHz (I'm going to round to keep it simple), so the maximum permissible exposure for a person at that frequency would be about 3 milliwatts/cm^2 for "controlled" exposure and 0.6 milliwatts/cm^2 for "uncontrolled" exposure. Radio power drops off pretty dramatically with distance, so the exposure they are talking about is pretty near the maximum allowed. (Again, the details get messy, but trust me that most radio frequency engineers aren't going to view 0.3 vs. 0.6 as a big difference. In RF terms that is a 3 dB (decibel) difference in power.) I wouldn't term it "low-intensity" like they do, but it is below the allowable intensity, which is probably what matters.
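To make the arithmetic concrete, here is a small sketch of those f/300 and f/1500 limit formulas applied at the study's actual 915 MHz (the function name and structure are mine; the formulas are from the ARRL table mentioned above):

```python
def mpe_limits(f_mhz):
    """Maximum permissible exposure (mW/cm^2) for 300-1500 MHz,
    per the f/300 (controlled) and f/1500 (uncontrolled) rules."""
    if not 300 <= f_mhz <= 1500:
        raise ValueError("these formulas only cover 300-1500 MHz")
    return f_mhz / 300.0, f_mhz / 1500.0

controlled, uncontrolled = mpe_limits(915)
print(f"Controlled:   {controlled:.2f} mW/cm^2")    # 3.05
print(f"Uncontrolled: {uncontrolled:.2f} mW/cm^2")  # 0.61
# The study's 0.3 mW/cm^2 is about half the uncontrolled limit.
```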

      So I claim that 3 dB (half power) isn't a significant loss. Is that reasonable? There is a thing called "free space loss" you deal with in RF. Basically, if I send a signal from Point A to Point B and there is nothing but air between them, I see some (significant) loss. This is a logarithmic relationship--in other words, going twice as far doesn't mean twice as much loss, it means a lot more. From a non-directional antenna the radio waves spread out in all directions, so just from that you see a loss related to the square of the distance (because the surface area of the expanding wavefront sphere increases with the square of the distance).

      Just looking at some quick calculation, I think you'd lose a bit over 90 dB over a 1 km path at 900 MHz. Remember, each 3 dB means the power has been cut in half. 10 dB cuts the power to one-tenth (move the decimal point one place to the left, in other words). So, again, 0.3 mW/cm^2 is lower than the maximum allowed uncontrolled exposure, but it is much, much higher than you'd ever likely encounter walking around in the real world. (I am talking about things like tower-mounted base stations now. Holding walkie-talkies or cell phones to your head is another topic for another time.)
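Here is that dB arithmetic worked out, using the standard free-space path loss formula (FSPL in dB = 32.44 + 20*log10(f in MHz) + 20*log10(d in km)); the function names are mine:

```python
import math

def db_to_power_ratio(db):
    """Convert a dB loss to the fraction of power remaining
    (3 dB ~ half power, 10 dB = one-tenth power)."""
    return 10 ** (-db / 10.0)

def free_space_loss_db(f_mhz, d_km):
    """Standard free-space path loss formula:
    FSPL(dB) = 32.44 + 20*log10(f in MHz) + 20*log10(d in km)."""
    return 32.44 + 20 * math.log10(f_mhz) + 20 * math.log10(d_km)

print(f"3 dB keeps {db_to_power_ratio(3):.0%} of the power")    # 50%
print(f"10 dB keeps {db_to_power_ratio(10):.0%} of the power")  # 10%
print(f"915 MHz over 1 km: {free_space_loss_db(915, 1):.1f} dB loss")
```

Running this gives roughly 92 dB of free-space loss at 915 MHz over 1 km, which is why a tower a kilometer away delivers vastly less power density than the 0.3 mW/cm^2 used in the study.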

      I could look at the 20 millisecond pulse and understand it -or- I could look at the 4 Hz modulation and understand it. Together I'm not exactly sure what they mean. If the pulse wasn't mentioned, the 4 Hz square wave modulation would basically turn the microwave signal on and off four times each second. From having messed around with RF, I would venture a guess that the pulse means that what they are actually doing is triggering the 20 millisecond pulse four times each second. (That is a guess based on my experience with RF equipment you'd typically find in a laboratory.)

      I do have a bit of a problem with that. The math is deeper than I could sketch out on a sheet of paper, but if you look at the Fourier transform of that kind of signal, you are going to have frequency components related to 4 Hz and its harmonics (with a square wave, primarily the odd harmonics like 12 Hz (4*3), 20 Hz (4*5), 28 Hz (4*7), etc.), components related to 50 Hz (the reciprocal of the 20 millisecond pulse width) and its harmonics (both odd and even for a pulse), and then the 915 MHz carrier signal itself.
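As a quick numerical check of my guess (a 20 ms pulse triggered 4 times each second), here is a small stdlib-only DFT of the baseband envelope. The 915 MHz carrier is ignored; the sketch just shows where the modulation puts its energy: at multiples of 4 Hz, fading out near the null around 50 Hz set by the 20 ms pulse width.

```python
import cmath

FS = 1000          # sample rate in Hz (baseband envelope only;
N = FS             #  the 915 MHz carrier is not modeled here)
PERIOD = 250       # 4 Hz repetition -> 250 ms period, in samples
PULSE = 20         # 20 ms pulse width, in samples

# Envelope of the modulation: a 20 ms pulse fired 4 times per second.
x = [1.0 if (n % PERIOD) < PULSE else 0.0 for n in range(N)]

def dft_mag(signal, k):
    """Magnitude of DFT bin k (= k Hz, with 1000 samples over 1 s)."""
    return abs(sum(s * cmath.exp(-2j * cmath.pi * k * n / len(signal))
                   for n, s in enumerate(signal)))

# Energy shows up only at multiples of 4 Hz (5 Hz is essentially zero),
# and shrinks toward the spectral null near 1/(20 ms) = 50 Hz.
for f in (4, 5, 8, 12, 48, 52):
    print(f"{f:3d} Hz: {dft_mag(x, f):8.1f}")
```

One wrinkle worth noting: because this is a short pulse (8% duty cycle) rather than a 50% square wave, both even and odd multiples of 4 Hz show up, which is consistent with the "both odd and even for a pulse" behavior.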

      Honestly, if I were building an experiment I would rather have fewer variables so I could be more certain about what was causing any effect I might measure. But being realistic, they probably made do with the types of lab equipment they had available to generate the signal.

      I'm not a biologist at all, but the summary of the conclusion is interesting:

      "The exposure to extremely low frequency microwaves alone significantly enhanced the fast electroencephalographic rhythms (18-30 Hz). This effect was observed neither in subsequent sham-exposure experiment nor in radiation-naïve animals. In the microwave-exposed rats, scopolamine (0.1 mg kg(-1), subcutaneously) did not cause a slowing in the electroencephalogram that was shown in non-exposed rats. A similarity between the scopolamine-induced electroencephalogram effect in the microwave-exposed rats and that of physostigmine (enhancing the acetylcholine level in the brain) in radiation-naïve animals was noted. This paradoxical phenomenon stimulates new experimentation for understanding its mechanism(s)."

      This is saying that if you take two rats and expose one to microwave pulses, the exposed rat shows an enhancement (presumably an increase in amplitude?) of the fast EEG rhythms (the higher frequencies, in the 18 to 30 cycle per second range).

      Someone else will have to give the scoop on scopolamine. (I looked it up on Wikipedia, so that's as much as I know...) Apparently when used on rats that had not been exposed to microwave pulses, it slowed their EEGs but it did not have that effect in rats that had been exposed to microwave pulses.

      The third part seems to say that this same lack of response to scopolamine was seen both in the rats exposed to microwave pulses and to rats that weren't exposed to microwave pulses but were given physostigmine. I looked that one up also but still don't have a clue why that particular result was called a "paradoxical phenomenon". This is totally a guess, but it seems like they found the fact that microwave pulses and physostigmine injections gave similar results interesting and something they thought could use further research.

      A couple of observations...

      They call them low frequency microwaves, but I think they really mean pulsed microwaves where the pulsing is done at a reasonably low frequency. (This would fall within the definition of LF Microwave as the term seems to be used by this group, and matches the potential definition I offered in my earlier note.)

      Also, the EEG results offer a specific physiological response that can be measured. I doubt anyone has an EEG machine lying around, but if The Hum is caused by pulsed microwaves of some sort then you would expect to see increased EEG activity at the same time. (I always hesitate to assign causality. It can be a chicken and egg thing. Does increased EEG activity cause the symptoms some people observe as The Hum? Does agitation from The Hum cause some people to experience distress that leads to increased EEG activity? Saying it is one way or the other would just be a guess. That is assuming that you even found that there was a correlation between The Hum, pulsed microwaves, and increased EEG activity in at least some of the people who suffer from The Hum.)

      A final observation is that even scientists can get a bit loose and free with their definitions. (Like talking about low frequency microwave signals.)

      Looking back, I apologize if this seems like a lecture. My personal feeling is that there are several causes for The Hum and that they are likely to be discovered one by one over time. I ramble when I type, but I also hope that something somewhere might trigger an "ah-ha" for someone else. I also would rather be explicit in my thinking so that people can correct me if I've gone astray someplace or knowledgeably agree with points if they understand them. I would definitely be interested in other people's thoughts on the subject and on the study that was referenced.

