But It's a Good Story!

Wednesday, October 30, 2013


Believing Everything You Hear

Today marks the 75th anniversary of a landmark event in American history -- a fascinating event which seemingly serious people are still talking about all these years later, even though (spoiler alert!) it's a total fraud. For an event to have such a lasting hold on the American imagination, when it didn't even happen, is an achievement worth pondering.

I am talking, of course, about the chaos stirred up by a young Orson Welles on Sunday evening, October 30, 1938, with his Mercury Theater radio broadcast of "The War of the Worlds", a dramatization of the H.G. Wells novel about invaders from Mars. The broadcast presented the story as if it were actually happening that night, with fake news bulletins pouring in from the New Jersey town where the murderous aliens had supposedly landed.

The next day, extremely sensational reports appeared in the nation's newspapers (and not just the tabloids -- the New York Times got into the act, too) claiming that the broadcast had touched off a nationwide panic. The roads were choked with would-be refugees from the Martian conquerors. There were reports of people dying of fright, or becoming suicidal, as a result of the deception. Strenuous efforts were made to crank up national outrage against Orson Welles and CBS for perpetrating this deeply irresponsible hoax on the American public.

Don't misunderstand me: when I say that the event didn't happen, I'm not referring to the Martian invasion (although, of course, that didn't happen either). The more interesting aspect of the story is that the nationwide panic didn't happen. The lurid newspaper stories about the supposed panic were remarkably short on specifics that could be checked (people who supposedly committed or attempted suicide were not identified by name, and the hospitals where such people supposedly were taken denied having handled any such cases). From that day to this, historians have failed to unearth any evidence of a nationwide panic, or even any small-scale local panics.

How could the show have touched off a panic anyway? First of all, not many people heard it. The program had low ratings -- it was up against a far more popular variety show. Most Americans that Sunday night were listening to another show, or not listening to the radio at all. Of those who did hear the broadcast, most understood that it was a fictional story, not the evening news. I have heard the broadcast, and although the fake news-bulletins in the first part of it are an exciting way to tell the story, they are not carried through the whole program, which is clearly a dramatization. Only someone who tuned in late to the broadcast, and turned off the radio well before it was over, could have been fooled. And even in that case, they only needed to tune to another station, or look out their damned front door, to realize that nobody else was in a panic and nothing weird was going on in the real world. So, if anybody did panic, they were a tiny minority of that minority which even heard the broadcast in the first place. The lack of real evidence of a panic is most reasonably accounted for by there not having been one.

It is also telling that newspapers which had covered the "panic" so sensationally the day after the broadcast simply dropped the story within a day or two -- perhaps embarrassed by the letters to the editor that were coming in, from puzzled citizens who were asking "What panic?", and were reporting that, contrary to newspaper accounts, the roads had been empty that Sunday night, not crowded with citizens fleeing the Martian army.

The only support for the "panic" story comes from a respected (but apparently terminally naive) Princeton psychologist named Hadley Cantril, who commented on the panic two years after it supposedly happened and, unfortunately, legitimized the legend. He looked at some pretty questionable survey data on whether or not listeners to the broadcast said they had been "frightened" by it, and assumed that if they said they were "frightened" by it, they meant they thought it was real. Huh? A lot of people were "frightened" by Jaws and The Exorcist and Alien, but that doesn't mean they mistook these films for documentaries.

So, there is no reason to believe that the famous Martian panic of 1938 ever happened. And yet, people love the story of this panic, and continue to tell it, and enjoy hearing it. (I love the story myself; I was brought up on it, and I'm terribly disappointed that there's no reason to regard it as fact.) Documentaries continue to be made about this story, repeating sensational nonsense that has long been known to be nonsense.

So what interests me about this story is how a myth becomes a myth. How does a false story become loved and endlessly retold throughout a culture which ought to know better?

Probably two things are needed to get a myth as firmly established as this one is: first, there has to be somebody whose interests are served by having the story believed, and second, the story has to appeal to people in some way, so that they want to buy it whether it's true or not.

Whose interests were served by this myth? Obviously the newspapers that started it. Why would the newspapers want to create a phony scandal out of a panic that never happened? Because newspapers were feeling threatened by the radio industry (just as newspapers today feel threatened, with a bit more justification, by the internet). Newspapers are always feeling threatened by technologies which give people alternative sources of news, and this was no less true in 1938 than it is today. Newspapers feared radio as a competing medium for news, and in the Orson Welles broadcast they thought they had found a golden opportunity to strike a blow against their hated rival. They wanted to convince the public that broadcasters were irresponsible and should never be trusted as a news source. A dangerous nationwide panic over nothing -- this is what happens when people accept broadcasters as journalists! It's hard to prove that this is what motivated the "panic" stories, but it's extremely plausible as an explanation for something which doesn't otherwise make sense. (And look at how eagerly newspapers today report any story about the internet causing trouble!)

But the newspapers, having started the myth, dropped it almost immediately. So what kept it going? Why did people want to believe in this legendary panic, and why do they still want to believe it?

In my own case, part of the appeal when I first heard the story was that I liked any story which seemed to prove that people were a lot more stupid when my parents were young. Any story which tells us that other people are dumb has a powerful built-in appeal. However, the pleasure of looking down on the gullible can't be the whole explanation for this myth's popularity, because the myth was partly kept alive by those who bragged about having been among the gullible themselves. I think it is significant that, as time went on, the number of people who claimed to have heard the broadcast expanded improbably, just as the number of baby-boomers who claimed to have attended the Woodstock music festival did. The Martian panic, like Woodstock, came to seem too exciting and significant in retrospect for people to want to say they had missed the whole thing. They went from feeling that they ought to have been involved in it to saying that they actually had been involved in it.

I read about a psychology professor who, the day that the Challenger space shuttle exploded in 1986, asked his students to write down a description of how they heard the news of the disaster and how they and others reacted. He collected and saved these accounts. A year later he asked his students to repeat the assignment. The accounts they wrote a year later were so much more dramatic and emotional that, when he confronted the students with the mundane, matter-of-fact accounts they had written when the incident was still fresh in their minds, they were stunned and confused by the contrast. Their memories had merged with later press accounts and public sentiment about the event; they "remembered" a drama of which they were unaware when it was supposedly happening, because in retrospect it seemed too important for there not to have been drama. I think something similar happened with the Martian panic: in retrospect it seemed as if one ought to have been excited about it (or at least conscious of it, for crying out loud!).

Anyway, that's how I think it happened. Newspapers rushed into print with unconfirmed rumors about a scandalous situation, because they thought the scandal would benefit them. And although newspapers dropped the story, everyone else tended to cling to it, because it happened to appeal to them for reasons unrelated to the reason it appealed to the newspaper editors who got the ball rolling.

The reason I think all of this is worth pondering is that untrue-but-unkillable stories are not confined to newspaper coverage of competing news media. Notions of what is healthy for us, particularly in the eternally problematic area of nutrition, tend to be launched by someone with an axe to grind and a tendency to cherry-pick data... and are then kept alive for decades by whoever happens to find these notions appealing.

The notion of dietary fat, especially saturated fat, as our nutritional arch-enemy, the promoter of heart disease, the evil magnifier of LDL cholesterol, seems to have been launched in that way, and kept alive in that way, and it doesn't look as if it's likely to be true. Because I have had to restrict carbohydrates to maintain glucose control, I have been obliged to eat a diet which is higher in fat than most nutritionists would ever accept, and although I used to try to avoid saturated fats, I stopped bothering with that, and my cholesterol readings still look great. (What seems to make the difference for me is exercise; I haven't had bad cholesterol readings since I started exercising regularly, though I certainly did before that.)

The nutritionists could certainly have the last laugh at my expense, if I have a heart attack despite my normal cholesterol readings. But they can't have it both ways: they can't say dietary fat gives you heart disease by driving up your LDL cholesterol, and then say dietary fat is the reason for my heart attack even if it never succeeded in driving up my LDL cholesterol at any point in the last 12 years. Anyway, the hypothesis that dietary fat equals high LDL equals coronary heart disease is sort of like the Martian panic: we've all heard about it for many years, and we've all assumed it was true, but maybe it's just a story that a lot of people, for whatever reason, happen to like.


When Is Good News Bad News?

Tuesday, October 29, 2013


The Transplant Donor Problem

I remember an old Monty Python sketch about a transplant surgeon lamenting the shortage of donors: "There simply aren't enough accidents! And it's unethical and time-consuming to go out and cause them...". I love that phrase: unethical and time-consuming.

I was reminded of that upon reading this lede in the Gupta Guide: "Neurologic death became progressively less likely among individuals suffering brain injuries in western Canada from 2002 to 2012, with potentially worrisome implications for organ transplantation, researchers said."

Less brain-death going on out there? That's terrible! Or at least it's "potentially worrisome".

One man's good news is another man's bad news. Improved treatment of head-trauma patients is a welcome development for head-trauma patients, but clearly not so welcome to those who had called dibs on their hearts and livers. I'm sure that any improvement in motorcycle safety would be regarded as a problem by the transplant surgeons who refer to those fuel-efficient but high-risk vehicles as "donorcycles".

I guess every situation that has a loser also has a winner. There's bound to be a word for that phenomenon... schadenfreude, or capitalism, or something of the sort. Anyway, good news for you is bad news for somebody else, and your tragedy is somebody else's profit opportunity. The all-normal lab results I got yesterday serve only to crush the dreams of somebody who was wanting to get hold of my (apparently very healthy) kidneys. Just think of it: every time you drive home from work safely, you ruin somebody's day!

I guess that's why we need to get the stem-cell research projects better funded: we need to get out of this situation where, at least in some quarters, we're considered to be worth more dead than we are alive, and our continued survival is considered to be an annoyance, especially by people who think they would put our kidneys to better use than we are doing.

My own feeling about transplantation has long been that we should assume we have to get by on the organs we were born with, and nobody else owes us any spare parts. But I probably feel that way mainly because I have never been in a situation where I needed an organ transplant. If I did need one, it might change my perspective on the issue dramatically. There's nothing quite like self-interest to inspire a change of attitude.

In the past I have participated in the annual running event called the Golden Gate Relay, which raises money for organ donation programs. At the start of the race there was a speech by a young woman who was alive only because of an organ transplant. The fact that she was young made the cause seem more sympathetic: I had been appalled by the situation in which Mickey Mantle (who had destroyed his liver by prolonged alcohol abuse) was moved ahead of everyone else on the waiting list, even though he had cancer and was unlikely to survive anyway (he lived for only two months after the transplant, while the people he elbowed aside might have lived for years). So it was nice to be reassured that the organs sometimes go to the right people, at least when no sports celebrities are competing for them.

Still, it would be better if we could create a situation in which we don't have to be in a competition to decide who gets to live. I don't expect to see stem-cell technology solve that problem soon, but the rumor mill says they're making surprisingly rapid progress. Someday there might be a way to make sure that there's a spare pancreas ready for you and the celebrity who wants one too.


Diagnosed as Awesome

Monday, October 28, 2013


My Lab Results

Last Friday morning I drove over to my friendly local medical lab to hand over a little more of my blood (and my urine, which in previous years had not been in demand) for analysis. Last night I received notification that my results had come back and my doctor had reviewed them. His message:

Your labs are awesome. Everything is normal. You are doing a great job. I talk about you with my newly diagnosed diabetics (not your name of course), how one can change lifestyles and fend the diabetes off.

So! Well into my 12th year with diabetes, I have been diagnosed with awesomeness. The conventional wisdom (too widely dispensed, if you ask me) says that 10 years is the longest that anyone should expect to be able to keep diabetes under control through lifestyle changes. To continue two years beyond that supposed limit qualifies you as awesome. I wonder if, when I get to year 15, I will graduate from merely being awesome to being an official freak of nature. But I shouldn't talk about that before I've actually done it, should I?

I always worry about my lab results, but I had an especially eager interest in this year's results, because last year there was a result I didn't like. My hemoglobin A1c last year came in at 5.8%, which is not considered a diabetic level, but is slightly above the 4.8-5.6% range which my lab defines as normal for their version of the test. In the years since I adopted my lifestyle program for diabetic control, last year's A1c result was the first one that was not in the normal range. It didn't upset my doctor any, but it worried me because I was afraid it might be the start of a bad trend. I think I had been too much affected by the conventional wisdom, too ready to believe that diabetes always gets out of control no matter what you do.

Realistically, I knew that my slightly elevated A1c result last year probably had something to do with my two-week trip to rural Ireland, where low-carb food choices were decidedly limited... and yet I couldn't help but see that 5.8% result as a sign that things were starting to slip beyond my control. What if I couldn't reel it back in? My daily glucose tests were looking good, but that doesn't tell you everything; you might go high while you're sleeping even if your daytime results seem normal. The A1c result can always surprise you, at least a little. So I really wanted a result in the normal range this time. And I got it: 5.5%!
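For readers who think in glucose-meter units rather than percentages, an A1c value can be translated into an "estimated average glucose" using the standard ADAG formula (eAG in mg/dL = 28.7 × A1c − 46.7). A quick sketch of what last year's and this year's numbers work out to:

```python
# Convert hemoglobin A1c (%) to estimated average glucose (mg/dL),
# using the ADAG study formula: eAG = 28.7 * A1c - 46.7.
def eag_mg_dl(a1c_percent):
    return 28.7 * a1c_percent - 46.7

for a1c in (5.8, 5.5):
    print(f"A1c {a1c}% -> roughly {eag_mg_dl(a1c):.0f} mg/dL average glucose")
```

So the gap between last year's 5.8% and this year's 5.5% amounts to roughly 9 mg/dL of average glucose -- a small slip, as slips go, but I'm glad to have reeled it back in anyway.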

My fasting glucose result was 90, which is of interest mainly because I could use it as a check on the accuracy of my meter, which had said 87 earlier that morning, and then said 96 shortly after my lab sample was collected. 90 is more or less in the middle of those two values, so the meter obviously comes reasonably close, but isn't perfect. Which is pretty much what I've learned from previous comparisons.
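Just as a sanity check, I compared those readings against the accuracy standard that home meters are supposed to meet -- the ISO 15197:2013 criterion, which is my own yardstick here, not anything the lab reports: a reading should fall within ±15 mg/dL of the reference value when the reference is under 100 mg/dL, or within ±15% otherwise. A quick sketch:

```python
# Check meter readings against a lab reference value, using the
# ISO 15197:2013 accuracy criterion as a rough yardstick:
# within +/-15 mg/dL when the reference is below 100 mg/dL,
# within +/-15% when the reference is 100 mg/dL or above.
def within_iso_tolerance(meter_mg_dl, lab_mg_dl):
    tolerance = 15 if lab_mg_dl < 100 else 0.15 * lab_mg_dl
    return abs(meter_mg_dl - lab_mg_dl) <= tolerance

lab_value = 90
for meter_reading in (87, 96):
    print(meter_reading, within_iso_tolerance(meter_reading, lab_value))
```

Both of my meter's readings pass comfortably, which matches my impression: reasonably close, but not perfect.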

The blood-lipids test (a potential warning indicator of atherosclerosis, coronary heart disease, and stroke) is a problematic one for a lot of people, but it never has been for me since I started my lifestyle program, and it wasn't a problem for me in this year's results either.

My doctor didn't order the PSA (prostate-specific antigen) test for me this year, but I've always tested normal for it in the past, and from what I've read it has come to be regarded as not an especially useful warning indicator for prostate cancer. Probably my doctor has decided to rely on the hands-on approach instead (which he'll get to put into practice on November 11th, when I go in for my annual physical).

The new test this year was the urine microalbumin test, which looks for tiny traces of the blood protein albumin in the urine. A properly functioning kidney keeps blood proteins in the blood, where they belong, instead of excreting them. But a diseased kidney often leaks blood proteins into the urine, and the first sign that this problem is developing is that microscopic amounts of albumin show up in your urine sample. Anyone who's had diabetes for as long as I have is considered to be at risk for kidney disease, which must be why my doctor added that test this year. The test results are reported in extremely confusing terms, but I guess the essential thing is that "normal" is anything below 30, and my value was 4. So my kidneys are apparently doing fine.

Well, there you have it: apparently I'm awesomely normal. Not everyone whose health is normal gets to be called awesome for it, but I guess the awesomeness lies in being normal when the conventional wisdom says that, for someone like you, normal should be out of reach.


Sleepless Nights

Thursday, October 24, 2013


Why Does Sleep Matter?

It has been known for years that sleep deprivation has a connection with Type 2 diabetes and other health problems -- not only do population studies show that people who get less sleep are more likely to develop diabetes and heart disease over the years, but scientists conducting short-term studies of sleep deprivation have directly observed startling increases in blood sugar levels in healthy young volunteers. However, most studies that have looked into this phenomenon have simply confirmed that it happens, without providing any insight into why it happens. Scientists always wrap up their reports by saying that "further study" is needed, but it seems to me that further study of the link between sleep deprivation and Type 2 diabetes is pointless, if by further study we mean collecting further proof that the link exists. But if further study means trying to figure out the nature of the link and how it operates, it is definitely worth doing.

Not that it's likely to do me any good. I have come to the sad conclusion that my own troubled relationship with sleep is more of a personality problem than a medical problem. The truth is that I don't like sleep and never have; ever since I was a child I have resisted falling asleep, partly because of a mental tendency to equate sleep with death (or at least with wasted time), and partly because of a biological-rhythm problem which causes me to be sleepy when I ought to be alert and vice versa. Late at night feels to me like the natural time to read a book. I realize that lots of people will read a chapter at bedtime to get themselves ready to nod off, but in my case that nodding-off moment may not arrive so soon. No book is half as fascinating to me while the sun is up as it is when I ought to be sleeping. But even if I put down the book and turn out the light, my brain stays active.

So, any discoveries about how sleep deprivation harms our health are likely to come to me as a sad reminder of the price I'm paying for having such an eccentric relationship with sleep. I doubt they can discover anything which will scare me into revising my entire personality. Still, I'm at least interested in knowing what scientists are able to learn about this.

Researchers at the University of Helsinki looked at the effect of sleep deprivation on gene expression. (Although most of us tend to think of a gene simply as something you have or don't have, a gene is an active agent within the cells, and it matters how active, or inactive, it is.) Sleep deprivation has a tendency to "up-regulate" or activate some genes while "down-regulating" others. The Helsinki scientists looked at hundreds of genes that are affected in these ways by sleep deprivation, to see what the impact of such changes was on the body.

It turns out that sleep deprivation up-regulates genes that promote an inflammatory response. To be chronically sleep-deprived is to be in a chronic inflammatory condition.

This matters because chronic inflammation promotes the very same health problems which sleep-deprivation has been linked to. Inflammation promotes insulin resistance and therefore Type 2 diabetes. Inflammation also promotes atherosclerosis and therefore coronary heart disease.

So the new findings about sleep deprivation and gene expression certainly go some distance towards making sense of earlier research findings.

I'm not sure the "why" question is really being answered here, however. Why does sleep deprivation alter gene expression in a way which provokes an inflammatory response? Is this purposeful in some way? Is it an accident? How do the genes, or whatever is regulating them, know that you're not getting enough sleep?

These questions remain unanswered, and I suspect we may be waiting for the answers for a long time. But I'll try not to lose any sleep over it.


Sugar Amnesia

Wednesday, October 23, 2013


Hyperglycemia and Memory Impairment

According to the American Academy of Neurology, higher blood sugar is associated with memory problems -- even in people who aren't diabetic. A study which looked at 141 people (average age 63) who did not have diabetes, pre-diabetes, or impaired glucose tolerance, found that those whose blood sugar tested higher did worse on memory tests.

One test involved recalling a list of 15 words, 30 minutes after hearing them. On this test, an individual whose hemoglobin A1c test score was 0.7% higher would typically recall two fewer words. Also, those with higher blood sugar levels had a smaller hippocampus volume. So you see.

I should add here that the hippocampus is a part of the brain in vertebrates, and it is associated with memory functions -- at least by scientists, who aren't distracted by the laughability of the name "hippocampus". I looked up the origin of the name: it comes from hippos (Greek for horse) and kampos (Greek for a sea-creature). Clear so far? Well, perhaps not absolutely. Why refer to a part of the brain in such terms? Because the hippocampus happens to be shaped like a seahorse, obviously! If you doubt this explanation, you can check out the illustration below, to see how amazingly seahorse-like the hippocampus is.

Well, on taking a second look at the illustration, I'm starting to think that doctors used the Greek for seahorse because the Greeks didn't have a word for bratwurst. Or perhaps the doctors involved in the decision-making had elevated blood-sugar themselves, and this caused a bit of embarrassing verbal confusion.

Well, anyway, if you don't want the bratwurst inside your brain to shrink, until you start forgetting words and misusing them and saying "seahorse" when you mean "sausage-like", you want to keep your blood sugar under very good control. Even within the normal range, it's better to be in the lower half of the range, apparently because the risk of sausage-shrinkage is lower.



Tuesday, October 22, 2013

I had a meeting at work which didn't start until 6 PM, and it wasn't a meeting I was looking forward to, because I knew we were going to be discussing frustrating problems which, at least for right now, we can't really do anything about. Since the problems aren't solvable and simply have to be endured for now, I knew there wouldn't be much for anyone to do in the meeting except complain that the unsolvable problems were affecting them more than they were affecting others. An hour of competitive complaining, in other words. I expected to contribute my own complaints to the contest, but I knew it wasn't going to make me, or anyone else, feel a lot better. So I wasn't looking forward to the meeting.

With the meeting starting so late, I decided to delay my usual lunchtime run until the late afternoon, so that I could get in a hard workout to prepare myself spiritually -- and start the meeting mere minutes after getting out of a hot shower. (I did pause to dress, I hasten to add.) The route was hilly and scenic, and let me look down on my workplace from a ridgeline, with a brilliant sunset behind it.

It worked: I felt good physically going into the meeting, and I felt calm and relaxed during it. It wasn't a thrill, but it didn't bother me either.

Exercise: the good workplace drug!


Is Type 1 a Viral Disease?

Type 1 diabetes is not what I usually write about, but I cover it from time to time, if only to provide a clarifying contrast with Type 2. Also, the uncertainty surrounding its exact cause gives it a little of that unsolved-mystery interest which makes it easy to get intrigued by any new theory about it. Some people get wrapped up in trying to understand who Jack the Ripper really was, or what really happened to Amelia Earhart, or who really wrote those plays that were published under the seemingly fictitious name "William Shake-speare"; others get wrapped up in trying to understand what causes someone to develop Type 1 diabetes.

And there is something awfully weird about how Type 1 typically shows up in someone's life. The disease is sometimes called "juvenile" diabetes because it usually appears during childhood, and seldom appears any later than age 25. Why would that be? There's a genetic component to Type 1, but if you're simply born doomed to have the disease because of your genes, why don't you have it from birth? It's an auto-immune disease, meaning that your immune system goes haywire and attacks the beta cells in your pancreas as if they were invading bacteria. If that's going to happen at all, why doesn't it happen from the start -- or, if it could happen at any time, why isn't it just as likely to happen after reaching adulthood as it was before? It's strange that the danger would dwindle to nothing in early adulthood -- why should that happen?

The most plausible explanation for the reduction of risk over time is that only some people are genetically vulnerable to having the auto-immune reaction at all, and most of those who are capable of having that reaction have already had it by the time they reach adulthood. The fact that they don't all have that reaction in infancy suggests that something more than a genetic vulnerability is required to make the reaction happen -- there has to be some kind of triggering event, such as exposure to a virus, which makes it happen. If it is a virus that makes it happen, presumably most children get exposed to that virus somewhere along the road to adulthood; only a few escape exposure until they are 25 or older. After age 25, almost everyone who was primed to react to that virus has already been exposed to it, so new cases of Type 1 after that age are rare.

Now researchers in Finland have identified a group of viruses which may be the culprit. A set of "enteroviruses" known as the "group B coxsackieviruses" are associated with increased risk of Type 1 diabetes. (Based on the name, I would have thought the coxsackieviruses would cause trouble in an entirely different area, but I assume the researchers know what they're doing.)

But then I read this odd sentence in their report: "These findings are in line with other recent reports suggesting that group B coxsackieviruses can spread to the pancreas and damage the insulin-producing cells." That's ambiguously worded, but it seems to mean that Type 1 is not an auto-immune disease at all, and is caused more simply by a virus which directly does damage to the beta cells in the pancreas. Or is this just a shorter way of saying that the viruses can spread to the pancreas and trigger an auto-immune reaction there which damages the insulin-producing cells?

I checked various medical sites to see if they've been expressing any doubts that Type 1 is an auto-immune disease. Some haven't been: Johns Hopkins and JDRF unambiguously call it an auto-immune disease, as does that font of all human knowledge, Wikipedia. But the Mayo Clinic site leaves a little room for doubt: "The exact cause of type 1 diabetes is unknown. In most people with type 1 diabetes, the body's own immune system -- which normally fights harmful bacteria and viruses -- mistakenly destroys the insulin-producing (islet) cells in the pancreas. Genetics may play a role in this process, and exposure to certain viruses may trigger the disease." It could be that the group B coxsackieviruses trigger the auto-immune reaction; it could also be that they have a more direct impact, and damage beta cells without causing an auto-immune reaction.

Maybe Type 1 is like Type 2: a label for two or more diseases which are similar but not identical, and have different causes. Maybe a virus sometimes does direct damage to the pancreas, and sometimes does indirect damage by triggering an auto-immune reaction.

You might think it doesn't matter what damaged the beta cells in your pancreas, so long as they're damaged (and not producing insulin). But it could make a big difference in implementing new treatments for Type 1. If you want to implant new beta cells, or a new pancreas-like structure grown from stem-cells, it's important to understand the mechanism that originally caused the damage you're repairing -- because you don't want it to happen again!


Reading the Fine Print

Monday, October 21, 2013


My Poetic Weekend

I went to a friend's sixtieth birthday party on Saturday. It sounded like a startlingly old age for someone I thought of as a contemporary to be reaching, until I thought, "Wait a minute, I'm already fifty-five myself!". (And, as if that were not humbling enough, I later remembered that I'm not fifty-five, I'm fifty-six.) Anyway, I was pondering the milestones of advancing age, and trying to think of something amusing that I could say about them at the party. Eventually I distilled my thoughts into a poem which I read after dinner:

The milestone birthdays mark symbolic waypoints that we've crossed.
The early ones mark what we've gained, instead of what we lost:
our sixteenth birthday's not a time for anything but fun,
and no one heaves a heavy sigh on turning twenty-one.

At thirty, though, we're looking back. Now, if we concentrate...
can we recall a time when we had trouble gaining weight?
At forty, we can't help but feel that time has shown its traces:
the parts of us that once were hairy, and were not, trade places!

At fifty, we can satisfy our morbid curiosity
concerning what it's like to have a full-length colonoscopy.
Those are the milestones I have hit, and that is what befell me.
Whatever sixty's like, I'm hoping Robin will not tell me!

It went over well.

The next day I felt it was time for me to do a long run in the state park; it used to be my regular practice to do a trail-run in the seven-to-nine-mile range every weekend. My recent poison-oak experience killed off some of my enthusiasm for the great outdoors for a while there, but I figured it was time to push past that mental barrier. I was afraid I would be weak from the long layoff, but to my surprise I felt quite energized. I did an 8.5-mile route and didn't feel worn out at the end.

My hips felt a bit sore, though, during today's run and tonight's yoga class. So there's a reminder of the passage of time!


Choosing Wisely

I was all ready to get angry at the Endocrine Society and American Association of Clinical Endocrinologists, based on what I was reading about their contribution to the Choosing Wisely initiative, and maybe I will after all, but it's not quite clear to me, from my reading of the fine print, whether I should get angry or not.

The 'Choosing Wisely' program defines its goals as follows:
"To promote conversations between physicians and patients by helping patients choose care that is:

Under 'Choosing Wisely', organizations of doctors that specialize in a particular area of medicine get together and issue lists of things that doctors should be discouraged from doing for patients, and patients should be discouraged from requesting. For example, the list for ophthalmologists includes a recommendation not to "routinely order imaging tests for patients without symptoms or signs of significant eye disease", and it specifically mentions "retinal imaging of patients with diabetes" as the type of test which doesn't need to be done, because if the patient is developing retinopathy it will eventually become obvious in other ways. (I haven't checked the oncologists' list to see if they say the same thing about tumors.) Anyway, 'Choosing Wisely' is a program which seems to be aimed mainly at cutting costs wherever doing so will harm only a minority of patients. And, because diabetes patients are notorious for soaking up health-care dollars, it shouldn't be surprising that some of the 'Choosing Wisely' lists seem to go out of their way to suggest cutbacks in treatment of diabetes patients.

So when I read that the Endocrine Society and American Association of Clinical Endocrinologists had come up with a 'Choosing Wisely' list of their own, and that the number one item on the list was "Avoid routine multiple daily self-glucose monitoring in adults with stable type 2 diabetes on agents that do not cause hypoglycemia", I thought, great, here we go. The most effective tool there is for empowering people who want to manage their diabetes effectively, and the endocrinologists want to take it away from them. The reason for this, at least according to Medscape, is that widespread glucose testing among diabetes patients has been "financially damaging, as until recently there had been widespread overbilling of Medicare for unnecessary glucose test strips for beneficiaries." That comment was linked to a report from the Department of Health & Human Services, and the supposedly scandalous findings there consist, so far as I can see, entirely of complaints about doctors not filling out the right paperwork justifying why the patients needed their test strips. To the bureaucratic mind this is no doubt a crime that cries to heaven, but it doesn't make my blood boil; I'm not nearly as likely to get angry over patients being helped as I am to get angry over patients not being helped.

However, when I consult the actual text of the 'Choosing Wisely' list for endocrinologists, the full wording of Item 1 is a little more nuanced than I had expected:

"Avoid routine multiple daily self-glucose monitoring in adults with stable type 2 diabetes on agents that do not cause hypoglycemia. Once target control is achieved and the results of self-monitoring become quite predictable, there is little gained in most individuals from repeatedly confirming. There are many exceptions, such as for acute illness, when new medications are added, when weight fluctuates significantly, when A1c targets drift off course and in individuals who need monitoring to maintain targets. Self-monitoring is beneficial as long as one is learning and adjusting therapy based on the result of the monitoring."

That whimsical clause "Once target control is achieved and the results of self-monitoring become quite predictable" obviously comes from the pen of an author who has never attempted to control diabetes himself, but past that point, the statement starts to make sense and comes into a kind of working partnership with reality. "When A1c targets drift off course" describes something which, in the absence of regular glucose testing, is as routine, and indeed as inevitable, as a car drifting off course when the driver has his eyes closed. I take that to be a big enough loophole for all practical purposes.

Still, I know that not everyone will read the list as carefully as I am doing; most people will interpret this simply as "glucose testing doesn't help" and leave it at that. So, on that basis, I'm still pretty unhappy with the recommendation.


Addict Rats!

Thursday, October 17, 2013


Who Moved My Cookies?

Addiction is a complicated and still-mysterious phenomenon. Classically, addiction has been seen as the effect of chemical triggering of receptors on human cells. The opiates (opium and the other narcotics produced from it) happen to trigger human endorphin receptors -- and they overstimulate those receptors, because (unlike human endorphins) they don't dissolve immediately after linking to a receptor. Overstimulation of the receptors causes us to crave more of the same, and to need bigger doses over time to get the same effect. This is a mechanism of addiction which is easy to study and has been known about for a long time.

However, there seem to be other addiction mechanisms, which don't work by introducing a foreign compound which overstimulates cell receptors. Any established habit can become self-reinforcing, because habitual behaviors give regular small jolts to the brain's reward center, and eventually we come to demand that jolt if it isn't supplied; therefore, behaviors which don't involve ingesting any chemical substances (gambling, for example) can nevertheless become "addictive". (I'm putting that in scare-quotes because there are purists who claim this sort of thing isn't true addiction, but society in general seems to have accepted this looser usage of the word.) Some of us are more susceptible to this kind of addiction than others, but seemingly anyone can develop an addiction -- and to just about anything.

Lately there have been a lot of writers on public health who say that the junk food industry has been conspiring to get us all "addicted" to their products. There has been a lot of grim talk about these master manipulators, who know exactly what they need to do to make junk-food junkies of us.

Well, of course they know exactly what they need to do: they need to provide concentrated doses of ingredients which nature causes us to crave (because, in a state of nature, they're not very abundant). Sugar, fat, and salt are the substances that work in this way. You don't have to be a Bond villain with a secret laboratory to be able to figure out this rather obvious fact about human eating habits.

Foods that combine sugar and fat are especially likely to be habit-forming, and scientists like to do studies showing just how habit-forming they are. The latest is a study conducted by a professor at Connecticut College and his students, which claims to show that a common grocery-store treat is as addictive as cocaine.

Yes, that's right -- the treat in question is the Oreo cookie.

Perhaps you're thinking that you do like Oreos, but you've never wanted them enough to trade sex for them, so maybe they're not quite as addictive as cocaine. Well, that's just your opinion, derived from nothing more substantial than human experience. The Connecticut College team is dealing in science here; they watched rats in mazes and everything, so don't you be casting doubt on their conclusions.

The experiment began by feeding hungry rats one of two snacks, depending on which side of a maze they were on. On one side, the snack was Oreos. On the other side it was rice cakes. Like humans, rats prefer Oreos to rice cakes. (Also worth noting: the rats in the study tended to pry the Oreos apart and eat the creamy filling first.) Over time, the rats learned to associate one side of the maze with rice cakes and the other with Oreos.

In the next phase, rats were allowed to wander in the maze, with no food provided. They were observed to spend more time on the Oreo-associated side of the maze.

This whole experiment was then done again, this time teaching the rats to associate one side of the maze with a saline injection and the other side of the maze with an injection of a drug (cocaine or morphine). Later, with no injections provided, the rats were found to spend more time on the drug-associated side of the maze. And -- big news here, folks -- the amount of difference in time spent on either side of the maze was no larger for the cokehead rodents than it had been for their Oreo-addicted peers. So there you have it: Oreos are just as addictive as the most addictive drugs. Science says so!
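If it helps to see what's actually being compared, the place-preference measure boils down to simple arithmetic on time spent in each half of the maze. Here's a toy sketch with invented numbers (not the study's actual data), just to make the comparison concrete:

```python
# Toy sketch of a conditioned place preference score: seconds spent on the
# treat- or drug-paired side of the maze minus seconds on the other side.
# All numbers below are invented for illustration.

def preference_score(paired_side_sec, other_side_sec):
    return paired_side_sec - other_side_sec

# (paired-side seconds, other-side seconds) for each rat
oreo_rats    = [(380, 220), (410, 190), (350, 250)]
cocaine_rats = [(390, 210), (370, 230), (400, 200)]

def mean_score(rats):
    return sum(preference_score(p, o) for p, o in rats) / len(rats)

# The study's claim amounts to: these two means are about the same size.
print(mean_score(oreo_rats), mean_score(cocaine_rats))
```

That's the whole basis of the "as addictive as cocaine" headline: two group averages of time-spent differences that came out statistically indistinguishable.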

Perhaps the researchers worried that people would think they were leaping to a pretty bold conclusion there, based on nothing more than how much hopeful lingering was observed among rats in a maze. Well, they gathered more evidence: studying the brains of the rats, they found that the Oreos were activating even more neurons in the pleasure center of the brain than the drugs were! "This correlated well with our behavioral results and lends support to the hypothesis that high-fat/high-sugar foods are addictive".

Well, okay. Maybe they're right. I guess we'd better brace ourselves for a remake of "Reefer Madness", but with a different focus this time.


Grow Your Own!

Wednesday, October 16, 2013


The DIY Pancreas

Although Type 1 and Type 2 diabetes are separate medical problems with different causes, one thing they often have in common is a loss of insulin-producing capacity in the pancreas. In the case of Type 1, the loss is total or very nearly total, and is brought about by an immune-system reaction which attacks the insulin-producing beta cells in the pancreas. In the case of Type 2, the loss is partial, and the cause of it is not very well understood (although "glucotoxicity" from prolonged exposure to abnormally high glucose levels in the blood may be an important factor). Anyway, people with Type 1 would benefit greatly from getting a new pancreas, and some people with Type 2 might benefit from that as well.

Despite the theoretical advantages of having a new pancreas, not many diabetes patients get one in the form of a transplant. Pancreatic transplant surgery is difficult, expensive, risky, and has some undesirable long-term side effects. So, this kind of transplant surgery is usually given only to the kind of people who need it most: Type 1 patients whose diabetes is extremely hard to control by any other means. (Type 2 patients are almost never given a transplant.) Anyway, transplant surgery is the sort of solution that looks good on paper but is often impractical in real life.

There are reasons to think that some of the practical difficulties involved in a pancreas transplant could be avoided by using stem-cell techniques to grow small pancreas-like structures and implant them. A team of stem-cell researchers at the University of Copenhagen is beginning to have success with a procedure for growing a kind of miniature pancreas in the lab. If you want excruciatingly fine detail, it's here, but perhaps you'd prefer to read my breezy amateur summary of the thing...

Despite the naive understanding of cloning which we have all picked up from science fiction movies, the process needs to involve a lot more than just dropping a cell with the right DNA into a vat and waiting for the magic to happen. During early embryonic development, there have to be multiple cells present, and they need to be able to "communicate" with each other by exchanging signals in the form of "Notch" proteins and other chemical messengers. Also, it turns out to matter how the cells are placed physically -- arranging them on a flat surface in a Petri dish doesn't work. You have to suspend the cells in a gelatin to allow for growth in three dimensions. (You also have to provide some chemical signals to the developing cells which they aren't getting because they're in a lab rather than a womb.) But if you can get all these details working right, it's possible to get a clump of "progenitor" cells to start growing.

As the cells grow, they begin to differentiate and form a larger structure.

Pretty soon it looks as if it's trying to turn into a brain, but it doesn't.

Eventually this new structure starts behaving like a pancreas -- and starts to generate pancreatic hormones!

It's necessary to approach this breakthrough with caution. First of all, the experiment involved the pancreas of a mouse; nobody knows if the same technique would work on human cells. Second, nobody knows yet how well one of these mini-pancreas structures would function if planted inside a human body -- would it regulate its insulin production based on its sensing of blood conditions, or would it simply go its own way, and produce too much or too little insulin at a given time? Third, who knows if the body would accept this implanted mini-pancreas any better than it accepts a transplanted real pancreas? If you still have to take anti-rejection drugs for the rest of your life after implanting these things, this solution leaves a big chunk of the problem unsolved.

No doubt it will be years before any kind of practical medical therapy results from this work, but I like to follow these things anyway.


What's a "Diabetic" Lifestyle?

Tuesday, October 15, 2013


Can We Level the Playing Field?

A lot of people who are newly diagnosed with Type 2 diabetes wonder what exactly is a "diabetic diet", and they wonder what exactly is the new "diabetic lifestyle" they are supposed to be adopting. Above all, they wonder how they are supposed to stick to the diabetic diet and diabetic lifestyle when nobody else in their family, or their workplace, or their circle of friends is going to be doing any such thing.

This may be one of the biggest problems faced by any newly diagnosed diabetes patient: the stark choice you are seemingly obliged to make between illness and early death (if you don't change your habits because of your diabetes) or social isolation and alienation (if you do). Most of the other people around you don't have diabetes, so there's no reason why they should make any of the changes you're expected to make. And they certainly aren't going to make those changes. So if you do make those changes, it means you will have to slink off to a dark corner and nibble your celery while everyone else enjoys their deep-dish pizza. People without diabetes can eat all the deep-dish pizza (and garlic bread, and fries, and birthday cake) they want -- and they can be as sedentary as they want, too. Only people with diabetes need to worry about such things. It's hard to be the only person at the Christmas party for whom such things have consequences. It would be a lot easier to change your habits if everyone else needed to do it, too.

But what if everyone else does need to do it?

I read an article today from Diabetologia with the epic-length title "Lifestyle factors and mortality risk in individuals with diabetes mellitus: are the associations different from those in individuals without diabetes?". And it turns out that the answer to that rhetorical question is "no". The things that are associated with reduced mortality in diabetic people (exercise, moderate drinking, fruits and vegetables) have the same association with reduced mortality in non-diabetic people. Also, the things that are associated with increased mortality in diabetic people (lack of exercise, immoderate drinking, immoderate non-drinking, sugary foods) have the same association with increased mortality in non-diabetic people.

These associations tended to be stronger in diabetic people than in non-diabetic people (that is, the good things were more helpful if you were diabetic, and the bad things more harmful), but the general trend of what's helpful and harmful was pretty consistent regardless of diabetic status.

Although I'm inclined to wave away that difference of degree as unimportant, I realize it's the major obstacle to redefining the diabetic lifestyle as simply the healthy way for humans to live. If these lifestyle choices make more of a difference for diabetes patients than for other people, then other people are going to be very inclined to think, "Diabetes patients have to do that. Maybe I should do it, but I don't have to." And we all know how eagerly people do that which is recommended but not required. (Any teacher can tell you that the most frequent question they hear from their students is "Will this be on the test?")

I must confess to having some doubts about the entire methodology employed in the study, as it seems to demonstrate that pasta, of all unlikely things, is a net positive for diabetes patients. (My glucose meter begs to differ with that conclusion.) Maybe there's some oddball reason why pasta would end up being associated with positive long-term outcomes, despite its plainly negative impact in the short term. Still, the overall conclusion, that what's good for diabetes patients is also good for other people, sounds plausible to me.


We're Exceptional

Monday, October 14, 2013

Getting Back Into It

On Friday I did strength-training again after a long layoff, and found that I actually liked the way it made me feel afterward. I took it pretty easy on the weights (I didn't want to hurt myself by trying to pick up where I'd left off), and afterwards I felt that very mild muscle soreness that actually, in a strange way, feels good. I repeated the experiment yesterday, ratcheting up the weights I was using by a small amount. Again I felt the after-effect as a mild, agreeable soreness. But maybe I pushed it too much in raising the weights for the leg exercises, because I felt some tightness and pain in my legs during some of the yoga stretches I did tonight. Okay, so I didn't get it perfect. But it does feel good to be using the gym equipment again -- even if I will always think of running as my main form of exercise.

The weather has been great for running lately -- sunny, clear, comfortably warm in the afternoons, but a bit cooler and fresher at mid-day. To the extent that we get any fall colors here, we're getting them now, so it's pretty outside. I guess this is why so many people want to move here -- this isn't mid-October weather as most of the world knows it:


The New Normal

An artist named Nickolay Lamm has been making realistic illustrations of average human bodies, based on medical data collected in various countries, to give people a more realistic understanding of body shapes and sizes (he has, for example, created illustrations contrasting the proportions of Barbie dolls with those of actual teenagers). His latest project compares the bodies of Dutch, Japanese, and French men in their 30s...

...with the body of the average American man in his 30s:

Not the most flattering comparison, it turns out, when you slip him into the lineup. The American man has a body mass index of 29, compared to 25.2 for the Dutch man, 23.7 for the Japanese man, and 25.6 for the French man. Also, the American man has a disconcerting zombie stare which is apparently not seen in other lands; however, he looks less creepy (if not less plump) in profile.
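For what those numbers actually mean: body mass index is just weight in kilograms divided by the square of height in meters. A quick sketch (the height and weight below are my own illustrative guesses, not Lamm's source data):

```python
def bmi(weight_kg, height_m):
    """Body mass index: weight in kilograms divided by height in meters, squared."""
    return weight_kg / height_m ** 2

# Illustrative figures, not the actual measurements behind Lamm's models:
# a man about 1.78 m tall would need to weigh roughly 92 kg to have a BMI
# of 29, which lands him just under the conventional "obese" cutoff of 30.
print(round(bmi(92, 1.78), 1))  # 29.0
```

A BMI of 25 to 29.9 is conventionally labeled "overweight" and 30 or above "obese", which is why an average of 29 is such an unflattering place for a national average to sit.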

The explanation for the American exceptionalism here is unlikely to have much to do with genetics. After all, many American families were originally Dutch, French, Japanese, etc. Most of us trace our family background to places where people were (and are) thinner than we are.

What explains the difference? Seemingly lifestyle accounts for most of it. Better health-care systems in other countries are thought to account for some of the difference as well, although I'm not sure how that works (are doctors in other countries more skilled at talking their patients out of gaining weight?).

Of course other countries get a charge out of ridiculing Americans for our trend toward obesity (the British are fond of quoting a statistic of unknown provenance which claims that "one in three Americans weighs as much as the other two"). But most countries are doing their best to catch up to us in this regard, and no doubt they will before long. Once we can get to a situation where everyone everywhere eats junk food all the time, we'll eventually have a level playing field. Not that that's a good thing, but at least we Americans won't stand out so much in the lineup any more.


Acme Molecular Gadgets!

Friday, October 11, 2013

Hadn't done weights in a long while. Dreaded getting back into it. Didn't want to go. Made myself go. Ended up thinking "this isn't so bad!".



In connection with Diabetes Awareness Month (November, but for me it's kind of every month) I was asked to contribute to a survey article on the subject. All I was asked to do was state the NUMBER ONE thing I wish people knew about diabetes.

My reply was in relation to Type 2, not diabetes in general. I said that diabetes medications only hide the problem without addressing it; you won't start to address the problem until you change the way you are living.

It's Friday and I don't feel like staying up all night trying to elaborate on what I mean by that and why I think it's important. But I can give you an idea of my thinking with a little help from the old Chuck Jones Road Runner cartoons.

Most research on diabetes is obsessively focused on finding the next wonder-drug which will allow people with Type 2 diabetes to live as if they'd never been diagnosed with the condition. It is a search for newer and better molecular "gadgets", which will allow us to take a shortcut around the limitations which nature is imposing on us. And so, like Wile E. Coyote, we eagerly order the next new gadget from the Acme Corporation, convinced that this one will work out much better for us than the last few dozen of them did.

Never mind that the new gadget comes heavily loaded with potential drawbacks, some of them pretty obvious. Our faith in the next new gadget is immune to skepticism. We've heard great things about it, and we've simply got to try it.

And the Acme Corporation is always coming up with something new for us to try out -- some new way of tricking nature into giving us what we want. What could go wrong? With the Acme Corporation on our side, we can place ourselves in a position of superiority to nature; we can forget about nature's laws.

The only problem is that nature never seems to forget about nature's laws. Nature hasn't even heard of the Acme Corporation, and doesn't recognize that Acme has any authority over natural law.

I'm not trying to go on an anti-technology rant here; some molecular gadgets are useful and even life-saving. I certainly wouldn't want to be without antibiotics, vaccines, and anesthetics. But molecular gadgets are better at solving some problems than others. The molecular gadgets that have been developed for Type 2 diabetes do not have the kind of track record (in terms of safety and effectiveness) that would justify the Coyote-like enthusiasm with which they have been embraced. A lot of patients could do better without them than they are doing with them.


More Flu News

Thursday, October 10, 2013


The Flu Virus in Theory and Practice

Yesterday I promised to share what I had learned about the flu virus; I guess I'd better get on with it.

First of all, is a virus a living thing? There is some uncertainty about that. A virus tests the limits of any definition of "life". A virus is an extremely tiny chemical particle, about a thousand times smaller than a bacterium. It's too tiny to see in a light microscope (you have to photograph it with an electron microscope to see what it's shaped like). A virus cannot perform its own life functions by itself; it can't reproduce, or do anything else, except when it gets inside a cell of another living thing. Essentially a virus is a rogue nucleic acid that hijacks the chemical apparatus of a host cell, signalling the cell to make copies of the virus.

The virus particle consists of a nucleic acid core, with a protein coating wrapped around that, and sometimes a lipid membrane wrapped around the protein coat. The flu virus has that kind of lipid membrane. Here's how flu virus particles look in an electron microscope photograph:

And here's a more schematic view of it; the red and blue paraphernalia on the outside are part of the lipid membrane, the brown shell is the protein coat, and that coiled green snake in the core is the nucleic acid RNA (which is a lot like DNA, only single-stranded).

Bear in mind that the flu virus is extremely tiny; in the picture below it looks like a blimp, but the pinkish layer it's latching onto here is the surface of something vastly larger: a cell in the human respiratory tract.

Notice that the cell membrane has various odd little structures sticking out of it; these are receptors, which work like locks that can be opened by the right chemical "key". (For example, there are receptors which open up the cell membrane to a glucose molecule, provided an insulin "key" triggers them.) Human respiratory cells happen to feature sialic acid receptors on their surfaces (light blue in this picture), and the flu virus happens to feature a glycoprotein called hemagglutinin (represented as dark blue protrusions in the picture), which happens to fit the "lock" of the sialic acid receptors.

I say that the virus "happens to" match up chemically with the cell receptors, but the match-up is no accident; that's why a particular virus affects a particular type of cell. Respiratory cells are vulnerable to the flu virus specifically because they have sialic acid receptors, and the flu virus is able to use its hemagglutinin to bind to those sialic acid receptors -- and this fools the cell into unlocking its door and allowing the virus to push its way inside!

Once the virus is inside the cell, the virus particle breaks open, the RNA nucleic acid uncoils and starts taking over the cell's internal operations, forcing the cell to make more copies of the virus particle. Depending on the type of virus, it may burst the cell (destroying it) or it may emerge through the cell membrane and go off to invade another cell. The flu virus is apparently of the type which can force its way back out of the cell without destroying the cell, but that doesn't mean it does no harm to the cell; normal functioning of the cell is suspended while the cell devotes its resources to making copies of the virus. Whether a virus destroys your cells, damages them, or simply impairs their normal functioning, having a viral infection is going to make you feel pretty bad in one way or another.

It's difficult to make a drug which would kill a virus without killing the patient too. Anti-viral drugs for the flu work not by killing the virus but by inhibiting the process which allows the virus to emerge from the cell; this may not do anything to help the infected cell, but it does combat the spread of the virus while your immune system is fighting it.

Your immune system responds to a viral infection by producing antibodies to attack the viral particles; it also generates chemicals known as pyrogens which increase your body temperature. The purpose of the fever is to inhibit the chemical reactions within cells which build copies of the virus -- such reactions are optimized to work efficiently at the normal body temperature; they slow down during a fever.

Some particularly nasty viruses (such as HIV) are able to merge their genetic code into the DNA of host cells, so that when the cells reproduce, their descendant cells are "born" already programmed to produce copies of the virus. This makes the virus permanent even if it remains dormant and asymptomatic for years. Fortunately, flu doesn't work this way; once your immune system vanquishes it, you're done with it.

You're thinking "Wait a minute! If that's how it works, nobody should ever get the flu more than once. Why should I need a flu shot every year?". That's where we get to a particularly troublesome thing about viruses. Because of the peculiar way they reproduce themselves within a cell -- hijacking the chemical equipment that's already present there, and using it in a way it isn't designed to be used -- the normal quality-control processes that are involved in reproduction of genetic information do not take place for viruses. There is no error-checking. When a virus generates copies of itself, the copies can easily include mistakes. In other words, viral reproduction has a high mutation rate, so a virus tends to evolve relatively rapidly over time. New strains of the flu virus emerge every year. Your annual flu shot is a vaccine which covers multiple strains of the virus -- whichever strains the vaccine maker is able to identify soon enough to get them included in the batch. There can always be an unpleasant surprise waiting for us, if there is a big breakout of some obscure strain of the virus which did not get included in this year's vaccine.
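To put a rough number on "no error-checking": here's a crude simulation of genome copying at two different per-base error rates. The rates and genome length are approximate textbook figures (RNA viruses like flu are commonly cited at something on the order of one error per 10,000 bases per copy, versus roughly one in a billion for proofread DNA replication), not anything specific to this year's strains:

```python
import random

random.seed(0)  # make the toy run repeatable
BASES = "ACGU"

def copy_genome(genome, error_rate):
    """Copy a genome string, substituting a random base at the given error rate.
    (A substitution can silently pick the same base; that's fine for a toy model.)"""
    return "".join(
        random.choice(BASES) if random.random() < error_rate else base
        for base in genome
    )

def mutations(a, b):
    return sum(x != y for x, y in zip(a, b))

# The flu genome is roughly 13,500 bases of RNA.
genome  = "".join(random.choice(BASES) for _ in range(13500))
sloppy  = copy_genome(genome, 1e-4)  # no proofreading: ~1 error expected per copy
careful = copy_genome(genome, 1e-9)  # proofread DNA-style copying: errors are rare

print(mutations(genome, sloppy), mutations(genome, careful))
```

An error or two per copy doesn't sound like much until you remember that an infected cell churns out enormous numbers of copies, and each mutant copy becomes the template for still more copies; that compounding is what delivers new strains every flu season.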

I am still unable to discover an explanation for the claim that the nasal-spray flu vaccine is "unsafe" for diabetes patients. Anyway, the official recommendation is to get the vaccine in injected form, if you have diabetes.

And it's considered especially important for diabetes patients to get their flu shots, because diabetes patients are thought to be unusually vulnerable to the flu.

First of all, being diabetic makes you more likely to catch the flu in the first place, because diabetes can weaken the immune system; you might succumb to flu virus which a non-diabetic person would be much likelier to be able to fight off.

And once you do have the flu, being diabetic can make the illness harder to cope with. A viral infection can reduce your insulin sensitivity, so that glycemic control becomes more difficult. (Being ill also makes you inactive; you're not likely to be exercising while you have the flu, which also makes glycemic control more difficult.) The general recommendation is for diabetes patients who have the flu to check their blood sugar more frequently than usual (and very frequently if they discover that dramatic swings are occurring).

Also: drink a lot of non-caloric liquids; the flu can dehydrate you, which is yet another thing that doesn't help you with glycemic control.

If you know your health is delicate enough for a bout of the flu to be a danger to you, call the doctor immediately after flu symptoms develop; antiviral drugs may reduce the impact of the infection on you, but only if you start taking them early enough.

I've already had my flu shot for the year, and I'm hoping it will work out. And if there's going to be a strain of the flu going around which wasn't included in this year's vaccine, I hope my immune system will fight it off anyway. Exercising regularly seems to strengthen the immune system; since I started my exercise program in 2001, I've had the occasional cold but not the flu. I hope I can keep it that way!


The Noncomputer Virus

Wednesday, October 9, 2013


The Flu & You

I've been seeing articles about how people with diabetes need to take special precautions as we head into the flu season. And this has reminded me of one of the great mysteries of science writing: what is it about the subject of influenza which makes writers become hopelessly incompetent? If ever there was a health topic which is always written about badly, this is it. I have never read anything about influenza which did not drive me crazy with frustration, because the writer never answers the reader's most obvious questions.

"Diabetes Patients Should Get Flu Shots, Not Nasal Spray" says the Medscape headline. Why not the nasal spray? The article amplifies this without much clarifying it: "...those with diabetes should get vaccinated -- ideally with shots, rather than the nasal-spray flu vaccine, for safety reasons". What is unsafe about the nasal-spray version of the vaccine, and why is it unsafe specifically for people with diabetes? The linked articles don't shed much additional light.

Seeking clarity, I explored the CDC's site on the nasal flu vaccine, and encountered this statement: "CDC does not have a preference for which of the available flu vaccine options people should get this season. This includes deciding between trivalent or quadrivalent vaccine or between injection (the flu shot) or nasal spray vaccine. All are acceptable options, but some vaccines are intended for specific age groups. Talk to your doctor...". Below that, there's a list of people who shouldn't be vaccinated with the nasal spray:

  • Children younger than 2 years
  • Adults 50 years and older
  • People with a history of severe allergic reaction to any component of the vaccine or to a previous dose of any influenza vaccine
  • People with asthma
  • Children or adolescents on long-term aspirin treatment
  • Children and adults who have chronic pulmonary, cardiovascular (except isolated hypertension), renal, hepatic, neurologic/neuromuscular, hematologic, or metabolic disorders
  • Children and adults who have immunosuppression (including immunosuppression caused by medications or by HIV)
  • Pregnant women

No mention of diabetes, but maybe that's covered by the mention of "metabolic disorders". Zero explanation of why the nasal vaccine is not recommended for these people, or what sort of consequences might ensue if the warning is disregarded.

You see what I mean? Writing about the flu somehow makes people's brains stop working, so that they are no longer communicating anything clearly to anyone and they don't care. (And by the way, writers on health issues: if you present "do this, not that" advice to the public without explaining why, you're crazy to expect people to listen.)

More of the same communication problem: many articles say that it's important to know the difference between cold and flu, but then proceed to give lists of symptoms for the two illnesses which seem awfully similar. The main difference between them seems to be that flu is "more severe", but what does that mean exactly?

The most useful differences I am able to spot by comparing these symptom lists are: flu is more likely to involve a high fever with a rapid onset, it is more likely to result in aches and pains, and it incapacitates you longer (say a week instead of two days). But all of that makes it sound as if diagnosis is a judgment call which a non-doctor isn't in a very good position to make.

It turns out that there's a test your doctor can give you in the examining room which provides results in a matter of minutes, and this test can distinguish between cold and flu; it can even, to some degree, indicate which type of flu virus you've got. This test result enables the doctor to prescribe an appropriate antiviral drug to help minimize the impact of the illness. But there's a catch: you have to take the test soon after symptoms appear. If you don't start taking the antiviral drugs within 48 hours of the appearance of symptoms, it's probably already too late for the drugs to do you any good. So, if you think you're especially vulnerable to the flu and you hope to suffer less the next time you come down with it, get on the phone to your doctor as soon as the dreaded symptoms appear.

I learned more about the flu and diabetes today, but I'll have to finish writing about it tomorrow.


Now You See It...

Tuesday, October 8, 2013


Why Prevention is Better

I mentioned yesterday that diabetes increases your risk of retinopathy more than it increases your risk of anything else. (Retinopathy is damage to the fine structure of blood vessels in the back of the eyeball, where images are formed and transmitted to the brain.) Incidence of retinopathy in people who have had diabetes for a decade is supposedly as high as 80%, which is pretty creepy, considering what retinopathy can do to your eyesight.

That doesn't mean 80% of people who have been diabetic for ten years are blind. They may have mild impairment of vision, or none at all. Retinopathy is detected by direct examination of the retina in the ophthalmologist's office. I have one of these retinal scans every year, regularly disappointing my eye doc, who seems determined to find evidence of retinopathy in me and so far has not. Often a retinal scan reveals retinopathy which is asymptomatic -- it has not yet begun to affect the patient's vision. But it's a good thing to know it's developing, so that you can get your blood sugar under better control and prevent the problem from progressing.

When retinopathy progresses to the point that the patient is conscious of the problem, the impact can be pretty scary. For example, these two pictures are a simulation of what severe retinopathy does to your vision. The normal-eyesight view...

...is transformed by retinopathy into this:

Thus we see that retinopathy reduces the incidence of race-based discrimination. Apart from that, though, it seems to offer no advantages to society, so it seems as if we ought to be doing something about it.

Unfortunately, not a lot can be done about it after it has become severe. Once the tiny blood vessels in the eye have become damaged (blocked, leaky, misshapen -- all of that can happen), the body is not very good at restoring them to health. The body does try to address the problem, but it does so by growing more and more new blood vessels, and the retina becomes tangled with them.

As if that were not enough, a new study of diabetic retinopathy finds that a great many functional proteins are involved in the process of detecting light and turning it into optic nerve impulses -- and that those proteins are altered by retinopathy -- and that of those proteins, only about half can be normalized by treatment of diabetes with metformin. The rest of these retinopathy-damaged proteins tend to stay damaged. Apparently, even if your retinal blood-vessels can be repaired to some degree with laser surgery, compromised vision tends to remain compromised; treatment is aimed more at damage control than at restoration.

The old claim that an ounce of prevention is worth a pound of cure applies with a great deal of extra force to diabetes. When you're diabetic, an ounce of prevention is worth a ton of cure, because the damage that poorly-controlled diabetes can do to us is of a chaotic and complicated nature. It's about as easy to reverse the process of diabetes-driven bodily harm as it is to coax the toothpaste back into the tube.

So, let's stay focused on preventing diabetes from harming us, and not wait for medical science to find a way to undo the harm. It could be a mighty long wait.


Will You or Won't You?

Monday, October 7, 2013


A Seemingly Simple Question

"Will you have complications if you keep A1c under 7?"

This was a question somebody asked Google yesterday, and as Google referred them here, I might as well try to answer it. And it seems like such a simple question that I can see why someone might think it would have a simple answer. Which, of course, is usually a sure sign that the answer is going to be a lot more complex than the question.

Most people assume that "diabetes complications" are health problems which are caused by diabetes and never occur in non-diabetic people. It would be more accurate to think of them as health problems which (1) can be caused by the long-term effects of glucose in the bloodstream, (2) have a low rate of incidence in people with normal levels of blood glucose, and (3) have an increasingly high rate of incidence as average blood glucose levels become increasingly elevated above normal.

In other words, nobody has zero risk of these health problems, but diabetes magnifies a small risk into a large one. It's a little like the relationship between smoking and lung cancer. You can call lung cancer a "complication of smoking", but that doesn't mean lung cancer is unknown among non-smokers. In fact, it's fairly common in non-smokers; 10 to 15 percent of lung cancer cases are in people who don't smoke and never did. Obviously the risk has to be greatly increased by smoking, if smokers account for the other 85 to 90% of cases, but smoking only expands an existing risk that just comes with being a living breathing human.
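The arithmetic implied here can be made concrete with a quick back-of-envelope calculation. A minimal sketch, with one loudly labeled assumption: the smoking-prevalence figure below is mine, chosen only for illustration, not something the post claims.

```python
# Back-of-envelope relative risk: if smokers account for ~87.5% of lung
# cancer cases (midpoint of the 85-90% range quoted above), but only
# ~20% of adults smoke (ASSUMED figure, for illustration only), then the
# risk ratio is (case rate among smokers) / (case rate among non-smokers).
smoking_prevalence = 0.20        # assumption, not from the post
share_of_cases_smokers = 0.875   # midpoint of the quoted 85-90%

risk_ratio = (share_of_cases_smokers / smoking_prevalence) / \
             ((1 - share_of_cases_smokers) / (1 - smoking_prevalence))

print(round(risk_ratio, 1))  # -> 28.0
```

Under those numbers, smoking would multiply the baseline risk roughly 28-fold -- which is exactly the pattern described above: the risk is never zero for anyone, but one behavior expands it enormously.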

So, if our question of the day were "Will you have lung cancer if you never smoke?", then the answer would have to be, "Probably not, but there's no guarantee." Being a lifelong non-smoker does not eliminate your risk entirely, but it reduces it so much that it's well worth doing. The same applies to glucose control and diabetic complications; nothing gives you zero risk, but good diabetes management gives you a greatly reduced risk.

I looked for a graph of complications risks as a function of hemoglobin A1c test results; this was the best I could find (it's based on data from the Diabetes Control and Complications Trial, or DCCT).

The four colored lines on the graph show the relative risk for four different common diabetes complications: retinopathy (an eye disease which can cause blindness), nephropathy (kidney disease, which often leads to heart disease), neuropathy (nerve damage, which is painful and promotes other problems), and microalbuminuria (leakage of protein from the kidneys, which indicates both kidney disease and problems in the circulatory system generally). As you can see from the way the colored lines diverge, the increase in risk, for a given A1c level, is larger for some complications than others. At an A1c level of 12, the risk of microalbuminuria is magnified by a factor of four or five, while the risk of retinopathy is magnified by a factor of 20. (By the way, we're talking about A1c results averaged over a long period -- simply hitting one bad A1c level once in your life doesn't mean you're stuck forever with the risk factors associated with that level.)

This graph certainly has its limitations, at least for my purposes; it is scaled to include a very wide range of A1c levels and risk levels. The question I'm trying to answer here is about what happens when the A1c level is under 7. Let me zoom in on the lower left corner of the graph, where that range is covered.

Because, as I mentioned earlier, no one has zero risk of these "complications", the relative risk graph doesn't start at zero -- it starts at "1" -- which represents whatever the risk is for nondiabetic people, and is associated with an A1c level of 6 or lower. Note that the relative risk immediately starts climbing above the "1" level as A1c starts climbing above 6, and at an A1c level of 7 the risk gets pretty close to the "2" level (doubled risk), at least for retinopathy and nephropathy. That may seem like nothing, compared to the twenty-fold risk of retinopathy at an A1c level of 12. Still, if you can manage to get close to the normal risk, instead of twice the normal risk, it's obviously better to do so (especially if you can do it without drugs, so that you don't have to factor the risk of drug side-effects into your cost/benefit analysis).

Another problem with this graph is that it doesn't give us any idea of what the normal risk actually is for any of these complications; it's simply set equal to "1" and everything else is a multiplier. Absolute numbers are awfully hard to come by for this sort of thing, but an article on retinopathy which I read (and cannot now find) said (if I am remembering this right) that the normal risk of retinopathy over a ten-year period is something like 5%, that it's still under 10% if your A1c result is 7, but that it rises to 100% if your A1c is 12.
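Putting the half-remembered absolute numbers together with the graph's multipliers, the conversion is simple enough to sketch. This is a rough calculation using the hedged figures above (a baseline ten-year retinopathy risk of roughly 5%, multiplied by about 2 at an A1c of 7 and about 20 at an A1c of 12), not data from any study I can cite.

```python
# Converting a relative-risk multiplier (read off the DCCT-style graph)
# into an absolute ten-year risk, using the rough numbers quoted above.
baseline_risk = 0.05  # assumed normal ten-year retinopathy risk (~5%)

def absolute_risk(relative_risk, baseline=baseline_risk):
    """Relative-risk multiplier -> absolute probability, capped at 100%."""
    return min(relative_risk * baseline, 1.0)

print(absolute_risk(2))   # A1c ~7:  about 0.1, i.e. under 10%
print(absolute_risk(20))  # A1c ~12: capped at 1.0, i.e. ~100%
```

Note the cap: a 20-fold multiplier on a 5% baseline runs into the ceiling of 100%, which matches the article's claim that at an A1c of 12 retinopathy becomes close to a certainty.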

Naturally we would prefer to have zero risk of retinopathy, as it's a serious threat to eyesight. But if we're stuck with a minimum risk of something like 5%, let's at least not let it get a whole lot worse than that.

So, returning to the original question: your risk of getting diabetic complications if you keep your A1c under 7 is vastly reduced, though not entirely eliminated. But the risk isn't entirely eliminated even for people who don't have diabetes, if that's any comfort!


Question Time

Thursday, October 3, 2013


More Diabetes Questions Answered

Google tells me what people are trying to learn about diabetes, and I do my best to help them out.

"why would low blood sugar kill you if your not diabetic"

Why wouldn't it? Blood sugar that is low enough to kill a diabetic person ought to have the same impact on a non-diabetic person. Brain and heart functions need energy to sustain them, and in extreme hypoglycemia the energy isn't being supplied.

"why is it when diabetics sugar is low they say stuff they don't mean"

Don't be so sure they never say stuff they don't mean when their blood sugar is normal. Maybe they're just more convincing in their normal state.

Low blood sugar impairs the functioning of the brain, and can lead people into a confused and hallucinatory state. It might make them say stuff they don't mean. It might make them say stuff they do mean, but are usually able to prevent themselves from saying. The loss of consciousness associated with hypoglycemia can be a partial, gradual development, with the subconscious still able to run the mouth. It's pretty much the same thing as talking in your sleep. And how well does sleep-talking reflect your true feelings and beliefs?

"can urine prices tend to be sticky because of high blood sugar"

Don't ask me; I wasn't even aware that there was a futures market for that stuff.

"if blood sugar isn't always high is it diabetes"

High blood sugar is diabetes by definition, even if it isn't high every hour of the day. Absolute consistency is not a requirement.

"is it possible to be constantly rundown and tired with type 2 diabetes fasting glucose 123 a1c 6.5"

It is always possible to be constantly run-down and tired, regardless of what your diabetes numbers are looking like. But if your numbers aren't very far above normal, it probably isn't diabetes specifically that is causing your fatigue. It could be a different medical problem. It could also be as simple as lack of exercise. (Paradoxically, people who never exercise are tired all the time; people who exercise regularly are not.)

"can someone be diabetic with an a1c test result of 6.0"

The usual diagnosis point for diabetes, on that test, is 6.5. But a diabetes patient who achieves good control can bring the result down to 6.0 or lower. Is the patient still "diabetic"? If we are defining diabetes as the underlying condition (which isn't curable) rather than its current state of control (which can be normalized), then yes, you will remain "diabetic" forever in the sense that the underlying problem hasn't gone away and will always be an issue for you.

"if your blood sugar is within normal limits and you are diabetic can you still sweat a lot"

It's always possible to sweat a lot. However, if you are sweating a lot more than you used to, for no apparent reason, that's a symptom to be investigated. Poorly controlled diabetes could be the cause of it, but if your blood sugar is normal then you need to seek the cause elsewhere.

"i have sugar in my urine but my blood sugar is always great no matter what"

When your kidneys are leaking sugar into your urine, the explanation is not always diabetes. If your blood sugar is normal, but your urine is sugary, you need to be checked out for possible kidney disease.

"im diabetic and on medicine but i still wake up every hour to pee"

Sounds like the medicine isn't doing enough for you; the likeliest explanation for frequent urination is very high blood sugar, but you need to verify that, and it shouldn't be hard to do. So, forget your urine -- check your blood!

"how quickly does eating something sugary show up in your urine"

It varies a lot, owing to various factors (including how much water you're drinking), which is why testing your urine for sugar is such a dangerously imprecise way of finding out how your blood sugar is doing. So, forget your urine -- check your blood!

"i'm diabetic but i don't be peeing like dat is my sugar level be high"

Your blood sugar can be elevated enough to be harmful to you even if it's not elevated enough to cause abnormally frequent urination. So, forget your urine -- check your blood!

"scariest thing about type 2 diabetes"

Its harmful impact on grammar (see previous question).

Making the Comparison

Wednesday, October 2, 2013


Exercise versus Drugs

I have complained pretty often that, when researchers try to evaluate the effectiveness of a medication, they usually compare it to some other drug (or to doing nothing at all); they seldom compare a drug to exercise. The general failure of researchers to compare drugs with exercise gives the public the impression that drugs are necessarily so much more effective than exercise that the comparison is not worth making. But what if drugs are merely more profitable than exercise, not more effective? What if researchers neglect exercise as a therapy simply because nobody can get a patent on it?

BMJ (which used to stand for British Medical Journal, back in those primitive times when people made their own soap and writers used things called "words", but is just BMJ now) has tried to address the issue I've been raising. I don't know if they've done it just to make me happy, but they have just published a report called "Comparative effectiveness of exercise and drug interventions on mortality outcomes: metaepidemiological study". This is a meta-analysis, so it's a systematic survey of the research that has already been done on this subject -- and the authors complain of the difficulty of getting together enough data for a meta-analysis, when the subject is one which researchers have tended to ignore. But apparently they've found enough studies on exercise and drug interventions for the number-crunching to yield statistically significant results. It's especially interesting to me that this analysis looked specifically at mortality outcomes. A drug is often rated effective merely because it brings about a change in some number in a lab result, not because it made patients less likely to die, and the significance of a number in a lab result is always open to question. The significance of dying versus not dying is a little more cut and dried.

The authors of the report looked at various chronic health problems to see how exercise stacked up against the standard drug treatments for those conditions, and found that "In secondary analyses comparing exercise with the drug interventions pooled together, there was no definitive differences between drug and exercise interventions in coronary heart disease, heart failure, and prediabetes... Although exercise interventions were more effective than drugs in reducing the odds of mortality among patients with stroke, this finding was associated with large uncertainty in the exact estimate of the treatment effect owing to the small number of events."

Bottom line: drugs were no better than exercise for three of the diseases, and seemingly inferior to exercise in the case of stroke.

If exercise is as effective as drugs, and (unlike drugs) has desirable side-effects rather than undesirable ones... shouldn't exercise be taken just a little more seriously as therapy than it generally is?

Trust me, if the authors had wanted to ignore exercise and simply compare drugs to other drugs, they wouldn't have to complain about how hard it was to track down enough research data to make the comparison!

Maybe the situation will change. The University of Copenhagen is launching a large study (2500 participants) to investigate the possibility of curbing the worldwide diabetes epidemic by finding "the most effective combination of diet, exercise and lifestyle".


Bugs in the System

Tuesday, October 1, 2013

That's not a typo up there -- my test result today was 94 after lunch, the same as it had been before breakfast. Not as odd as it might seem; I did a hilly run, and then ate a very low-carb lunch (a salad without any starchy or sugary ingredients thrown onto it).


Bug Questions Getting Weirder, Says Search Data

In the Google statistics on what people are searching for (at least the people who end up at my site), questions about insects being attracted to the urine of diabetes patients are always a popular genre. Some recent examples:

I've addressed this general issue many times, but public curiosity on the subject is clearly insatiable. To dispense with it briefly: when blood sugar is extremely elevated, sugar leaks through the kidneys into the bladder, and the sugary urine which the patient then produces can be attractive to ants and bees -- which is how the disease was recognized in ancient times. However, we have much more reliable ways to diagnose and track diabetes now, and it would be foolish to decide you are diabetic or not diabetic based on observed insect behavior. If insects seem to be gathering around every time you reach for your zipper, by all means have your blood sugar checked -- but don't assume there's no problem if you have not noticed any bugs stalking you. Insects could have their own reasons for lapping up non-sugary urine, or for scorning the sugary kind. It's never easy to be sure what an ant is thinking.

Anyway, that takes care of the more routine questions in this area, but lately I've been getting some that are a lot more original.

For example: "what kind of sickness that cause killing ant by urine". Presumably there has been some miscommunication here. I've never heard of a disease causing anyone's urine to be fatal to ants (although arsenic poisoning might possibly qualify). If there is such a disease, it might be an interesting one to have, at least during whatever period you are able to survive having a disease which makes your urine so toxic you can use it as an insecticide.

And there's this: "what will happen when bees attack diabetic human". Well, I'm guessing that what would happen when bees attack a diabetic human would be the same thing that happens when bees attack a non-diabetic human: the human receives a bunch of painful stings, and might go into shock if they have an allergy to bee venom. Bees aren't vampires; they don't bite you to get at your sugary blood; they just sting you. And bees only get to sting you once (the process causes fatal damage to the bee's abdomen), so it scarcely matters to the bee whether it is inserting its stinger into a diabetic or a nondiabetic person.

Bonus information: a worker bee stings only as a kind of suicide mission, in defense of a nearby hive -- the farther it is from the hive, the less likely it is to sting even if provoked. I used to think it made no sense (from an evolutionary perspective) for a bee to undertake a suicide mission, since that means its genes won't be passed on to another generation. But I was overlooking a crucial detail: the worker bees who die defending the hive are not going to reproduce anyway. Worker bees can pass on their genes only by defending the hive within which their reproductive relatives live. Because of the bizarre chromosome structure of social insects, the worker bees have more genes in common with offspring of the queen bee than they could possibly pass on if they were reproductive themselves. This oddity is what accounts for the strangely "selfless" behavior patterns of worker bees, soldier ants, and the like: the behavioral incentives are different for them.

Well, you knew the behavioral incentives had to be different for them, if their behavior included lapping up your urine. I mean, doing that has got to take a whole different world-view.


Reality Check

"The cheek of every American must tingle with shame as he reads the silly, flat and dishwatery utterances of the man who has to be pointed out to intelligent foreigners as the President of the United States." That was the Chicago Times, a paper favoring the Democratic Party, giving a bad review to a speech by a Republican president. The year was 1863, and the speech was Abraham Lincoln's Gettysburg Address.

I offer this historical reminiscence because a lot of Americans (myself included) are in the habit of assuming that journalism and public discourse in our country used to be far less disfigured by political partisanship than they are today. Perhaps pondering this will make it easier to tolerate the current nonsense going on in Washington.

On the other hand, it's going to be mighty hard for me to tolerate this:

I've been getting a lot of telemarketing calls lately; I think I have fallen off the Do Not Call list, and need to renew. I'm not sure how long I can put this issue on hold. I was ready to shrug off the reports about the cancer treatment trials being canceled, but telemarketing is a serious problem!

