Friday, July 31, 2009

A compromise solution for the management of my blood thinning medication has been agreed to between my psychiatrist and my electrophysiologist. After discharge, I will move into a retirement home in Burlingame, where they will make sure I keep up on the blood thinners. I'm 90% sure I would be fine managing my own meds, as I was for months after my heart attack and since my open heart surgery. The 10% uncertainty is due to the fact that I just suffered through a major depressive episode. As I related in the last blog entry, in the last week of that episode my dosing became erratic. I would forget doses entirely, and because my sleep was disrupted, I varied the times I took the meds. The risk that I would drop below therapeutic levels and suffer a stroke if the depression recurs is too much for my surgeon, and for me.

The retirement home is an expensive option. The stay would last three months. I'm considering whether or not to move out of my apartment; I'd been thinking about that anyway. Giving it up would help me bear the additional expense, and it would leave me prepared for a fresh start after the three months, which I very much need in any case.

My physical/psychological situation has complicated plans for my care in another way. I'm going to enter an outpatient psychiatric program after leaving the hospital. But before that happens, it may be that my psychiatrist would like me on the locked ward for evaluation. That usually takes three days. I've been down there. It's drab and boring and filled with miserable, suffering souls. But if that's what I need to do, then so be it. I'd like to get that over with as soon as possible, but they won't take me until my medical situation is resolved. So I will sit here in the hospital spinning my wheels over the next several days, when I theoretically could be "serving my time" downstairs. On the other hand, it may be that they will be OK with me going directly to the outpatient program. I hope that's the case.

This experience has been an eye opener for me on the relationship between psychiatric and general medicine. I've also gotten a look at how dentistry is treated by "regular" doctors. These topics are ripe for a blog post. I have time to kill, so I may write such a post over the weekend. :)

Wednesday, July 29, 2009

I had open heart surgery in January of this year. The surgery was to bypass blocked coronary arteries (CAB) and to remove scar tissue left from my heart attack (LVA). Another component was a modified maze procedure, to try to address my atrial fibrillation and atrial flutter. The first two procedures were a resounding success. My ejection fraction is 55%, which is essentially normal. This was an improvement from an EF of 30% before the surgery. But the maze procedure completely failed to address my arrhythmias. Since June, I've been suffering increasing bouts of arrhythmia combined with tachycardia.

Since May, I've been experiencing an episode of major depression. Over the last ten days, I started skipping medication doses due to forgetfulness. I also varied the times I took the medications because my sleep patterns were disrupted.

To understate the case, these two illnesses are interacting in an unfortunate way. It's not just that my depression interferes with my medication schedule. Because of debilitating symptoms (shortness of breath, exhaustion), the arrhythmia makes the depression worse. So that's a tidy little vicious circle.

Last Monday morning, about 2AM, the arrhythmia got worse. My heart was racing at about 140 beats per minute, and I became concerned about it. I took a cab up to the local ER, where I presented myself as suffering from arrhythmia and depression. After examining me, the ER doctor made the decision to admit me into the intensive care unit for the former condition.

I spent two nights in the ICU before moving over to the Telemetry Care Unit (TCU). I'm still sitting there as I write this. My electrophysiologist has a wealth of new data on my heart condition. That in turn has allowed her to clarify the choices I am faced with. These break down into two main options. First, we could continue "medical management" of my condition, meaning we could try to control the arrhythmia with drugs. The drug of choice in my case would be amiodarone. I went on this drug after my heart attack, and it controlled the arrhythmias. But then it gradually stopped working. Higher doses are possible, and we are trying those to see if we can't get the a-fib/flutter under control. The drawback of amiodarone is that it has toxic effects on the liver. Since I'm only 53, those effects would be more likely to show up if I stayed on the medication for the long term. Amiodarone also interacts with just about every other medication on Earth, limiting clinical choices when dealing with other conditions. And there's no guarantee that amiodarone will work at a higher dose.

The second option is ablation surgery. This is a technically very interesting procedure done by passing catheters up the femoral artery. Two carry sensing electrodes and one an RF generator. The surgeon attempts to induce the arrhythmia, then measures electrical conductivity on the atrium, looking for the rogue circuit. Once it's found, the RF generator is used to scar the atrium in such a way as to disrupt the circuit. Then the surgeon tries to induce the arrhythmia again, and the procedure is repeated as often as needed. Two factors complicate this choice in my case. First, I have both atrial fibrillation and atrial flutter. That means the circuits involved are more complex, and so correspondingly more difficult to fix. Second, it's likely that my arrhythmia arises from the left atrium. To get there, the surgeon has to drill through the wall separating the two atria. This lengthens the time required for the surgery. Most critically, however, it raises the chance of a stroke to 1%.

And that's where my depression comes in. If I don't keep up with my blood thinning drug, and maintain my INR within therapeutic range, the risk of stroke with a left side ablation rises to between 3% and 7%. As my electrophysiologist says, a stroke would "ruin everything" for me. So I need to be sure nothing will interfere with my dosing and testing schedule. But the depression has recently caused me to miss doses, so my electrophysiologist is insisting the depression either be resolved, or that I enter into a living situation where someone can ensure I take my meds on schedule.

Thursday, July 23, 2009

As a response to a political appeal from the White House, I described my recent experiences with the health care system. I reproduce my letter here.

I had a terrible heart attack in Jan 2008. Afterward, I fell apart psychologically. I lost my job, my disability income and my health insurance due to my own inaction. By the time I had collected myself enough to try to get these things back, I had a long way to trudge through the system to get access to health care services I desperately needed.

Getting my job back was easy. Merely getting in touch with my company did the trick. Getting health insurance and disability benefits turned out to be a lot more difficult. The medical carrier refused to take me back until open enrollment. This was six months away at the time I got in touch with them. It turns out they reversed themselves later, but as a result of the first decision, I found myself dependent on the County of San Mateo for my medical care.

San Mateo has pretty good coverage for medically indigent people. But lack of funds means that the doctors and staff are faced with huge workloads. For example, there are two cardiologists on staff for the entire county. These two doctors are both excellent physicians. The care I received from them was very good. But getting to it was difficult. The county pays classified staff poorly. As a result, people with a lot on the ball tend to move on to greener pastures. Those left try hard, but the combination of high workload and high turnover means there are many deficiencies in services supporting the medical work.

It took me many weeks to get to the point with the county's process where I could have an angiogram done. This was performed under contract at a non-county facility, since the county medical center lacked the equipment. In fact, it was at the hospital I had been brought to when I had my heart attack. The angiogram showed signs of ischemia. The examining doctor recommended bypass surgery. Partly due to my own missteps, but also because my insurance company wouldn't take me back, I had been living with a very dangerous heart condition, and with a difficult path to needed care.

This was in January of 2009, one year after my heart attack. I had applied to get my health insurance back the month before. Just before I was to have my surgery, the private insurance kicked in. My surgery, scheduled through the county system, was to be at the same hospital where I had the angiogram. It was to take place on a Saturday. That week, I went in for pre-surgical orientation on Wednesday. They were telling me things such as the fact that my hands would be tied when I woke up from the anesthesia (this so I wouldn't try to remove the breathing tube). Halfway through this process, one of the nurses delivering the orientation got a phone call from the HMO. That hospital wasn't under contract to them. They refused to pay for the surgery if I had it there.

The hospital I was sitting in when I got the news was the one to which I had been taken the year before, when I had my heart attack. I was later told by the doctor who saved my life that I had a 25% chance of survival when I was wheeled into the ER. Watching this guy bounce around like he was on springs, leading a team of people who were all trying hard to keep me breathing, was immensely reassuring at the time. I felt huge gratitude to the doctors, nurses and staff who had cared for me so well. What's more, I trusted them, and the hospital they worked at, implicitly with the surgery I thought I was about to undergo. The change in plans was a huge shock.

It turns out I retained the same surgical team at the new hospital. In fact, the new hospital was their home base. This meant the delay was only a week, the time it took to schedule the operating rooms. Though I kept my doctors, I have a strong impression that the change would have been forced by the HMO whether or not I could have retained the same team. If a new surgical team had had to evaluate my history and condition, the delay could have been much longer. As it was, the extra week's wait did mean an additional risk of something going badly wrong with my heart before I could get the surgery I needed.

That's my story. I'm still on disability recovering from the bypass. I have an ablation surgery coming up. It will be at my preferred hospital, since in the meantime, the HMO has been switched to the one associated with that hospital. This makes me pinch myself a bit to make sure I'm not dreaming. I want to be sure it's not one of those dreams where events seem almost real when you are asleep, but which reveal themselves to be complete nonsense when you open your eyes. I only wish this were nonsense and not the cold reality I actually experienced.

Wednesday, July 1, 2009

I love this laptop. But like love between humans, my attachment to my Mac is not perfect, and comes with reservations. Firstly, computer software sucks generally. The Mac is only a partial exception to this general rule. Second, I want all information to be free. I know this is childish and unrealistic, but that knowledge doesn't help. I am irrationally committed to Free Software. So the fact that the Mac is proprietary occasionally makes me cringe.

Here's a portion of an email I sent to a friend who works at Apple, which sums up my feelings about my new machine:

You guys keep churning out great stuff. I bought a MacBook Pro at the beginning of March, and I have no hesitation in saying it's the best damned laptop I've ever owned, and I've had *lots* of laptops. The thoughtful physical design, ergonomics, all that is fine. But it's the software that really rocks. Coming from Linux, I occasionally get annoyed by proprietary roadblocks or toll roads. But it's nothing compared with the pit of impotent despair that I sink into while wrestling with the execrable, wasteful, irredeemable pile of stinking rotting fish that is Windows. OS/X makes toeing the proprietary line seem OK. Windows brings out the insurrectionist in me. Depending on your political point of view, you could say that OS/X is Obama and Windows is Bush. They both represent the same system, but differences in style and substance make one a whole lot more tolerable than the other.

Another prediction of Linux World Domination(tm) has appeared on Geek dot com. Not atypically for this sort of thing, it's coupled with another prediction of the demise of Microsoft and proprietary software in general. Is it likely that we have actually heard a "death knell" for the proprietary approach? To provide a partial answer to that question, we can look at the successes and failures Linux has had on the road to world hegemony.

First, it's helpful to remember that we are dealing with at least three separate markets when it comes to Linux vs the proprietary competition. Those would be the server, embedded and desktop markets. Taking them in reverse order, please allow me to pontificate on the relative strengths and weaknesses of Linux in these market segments.

Desktop

Desktop dominance has hovered like a shimmering mirage in front of Linux enthusiasts, and some Linux companies for years. The massive failure of Vista gave people lots of hope that the day of desktop dominance had finally arrived. But the fact is that Linux gained a minuscule amount of share in this market over the years since Vista's release. Apple benefited more, but Macintosh is still below 10% share. Why is that?

Dirty Tricks

Microsoft has been infamous for using illegal methods in pursuit of its business. I followed the antitrust trial back in the 90s, and read about all the dirty tricks MS pulled to fend off the twin platform threats of Netscape and Java. The company seemed willing to go to any lengths to crush competitors. But Microsoft is unable to be quite so bare-knuckled these days. The antitrust trial illuminated a lot of Microsoft misbehavior. And, although the US DOJ under George W. Bush backed off on tough sanctions that might have been effective in modifying Microsoft's behavior, the European Union later stepped up and actually enforced its antitrust laws. This provided more evidence of Microsoft wrongdoing, and came with sanctions and restrictions that actually had some effect on the company. Bill Gates' reduced role in the company may have led to less over-the-top behavior by Microsoft too. But despite having to tone down some of the excesses of the past, the company still ferociously defends markets in which it is entrenched, and remains a potent threat to any competitor trying to play in one of them. However, dirty tricks are less likely to be the only factor driving Microsoft's success today. What else could account for it?

Hardware

Though the hardware driver picture has improved considerably on Linux, Microsoft's market dominance still means that hardware vendors are more likely to deliver ready-to-run drivers for their products on Windows first. Your latest PC may have trouble in that department. My not-so-new MSI Wind U100 netbook runs Ubuntu 9.04 Netbook Remix. It's gorgeous, but the wireless won't do WPA2 authentication. This means that until Ubuntu fixes the bug, this cool little netbook will work fine in the coffee shop or the airport, but not at home or in the office. You can obtain, patch, build and install a kernel module to fix the problem yourself, but if you think about it, you'll see that's irrelevant to this discussion.

Applications

Perhaps the weight of Microsoft's installed base, what the DOJ called the "applications barrier to entry," is the reason Linux can't seem to gain a lot of traction on the desktop. It's certainly true that Microsoft enjoys enormous leverage with software vendors, due to the massive market their platform provides. This leverage makes it more difficult to defect from Windows on your home PC. Your proprietary apps may work on Linux through some combination of wine and virtualization, but try to get support for those solutions, in the general case. What I mean by "support" is not just help when it breaks, but smooth and easy installation and initial configuration. And though high quality native equivalents to important commercial applications exist, few can boast the installed base, and the concomitant support from vendors and community resources that popular commercial apps enjoy. On business desktops, it may be more feasible to deploy Linux with a limited set of applications, either virtual Windows ones or native. But the IT staff is still faced with a relatively more difficult job supporting those apps given less vendor support.

New User Friendliness

Then there's the difference in ease of learning you often see between proprietary software and F/OSS. A specific example may show better what I mean. Take Photoshop and Gimp. Go to http://www.adobe.com/support/photoshop/ and compare the new user documentation to http://gimp.org/docs/. Pretend you have never used either app and try to figure out how to get started. Hint: you'll find the info at Gimp's site, but you'll have to dig deeper, and you won't get the same quality of material for the new user. And this difference extends to third party support for the applications as well. Do a book search on Amazon, first for "photoshop" then "gimp," and count the number of results. Go to Lynda.com and check out the Photoshop video tutorials. Try to find anything half as good for Gimp there or on YouTube.

This focus on newbies goes to the heart of the Windows platform advantage on the desktop. New users of Linux tend to be significantly disadvantaged compared to their counterparts in Windows as far as learning new applications goes. My feeling is that's so because the folks that develop the software, as opposed to the people who integrate the software into a distribution, tend to lack a new user's perspective. They produce software that is easy to use once you get to know it well. Since Linux desktop domination requires coaxing lots of Windows users onto an unfamiliar platform, this deficit in hand-holding newbies bites hard. Linux distributions struggle to provide the new user with a consistent and usable environment for desktop computing. New users of Linux are likely to perceive the difficulty in picking up a new (insert user's critically important app here) as the whole story on Linux as a platform. On the other hand, it's not like Windows as a platform offers a whole lot of help in the integration department either. But the apps tend to be designed by teams that include people who want to suck new users into using them. With apologies to the minority of application projects that have worked hard to design in discoverability, and who have provided outstanding, lucid and accessible documentation with the naive user in mind, all that is just not a priority in most open source application development.

Which is it then?

You can argue about how much of Windows' advantage is due to inertia and market size, and how much is due to apps that are relatively easy to learn, but there's an experiment underway that can help answer that question. MacOS X is a Unix-based OS famed for usability. Applications on MacOS X are often easier to use and learn than their Windows equivalents. On the other hand, Macintosh suffers from a similar disadvantage in critical mass to Linux with respect to hardware and software. Macintosh enjoys about a 10% share of the desktop market, whereas Linux is around 1%. Since the two platforms face similar (though not identical) challenges trying to overcome Microsoft's market domination, we can factor out those disadvantages when comparing the two OSes in terms of market share. To a first approximation, a substantial portion of that tenfold advantage in Mac desktops over Linux must reflect the advantage usability confers.

Embedded

In the embedded market, Linux has a big advantage in cost. If you are talking about a mass market item, like a cellphone, that cost advantage is a huge factor. Also, embedded applications tend to be under tighter control than typical desktop apps. An alarm system is a special purpose device: achieving usability is straightforward. (Even so, it's remarkable how many embedded applications suck bigtime.) The basic cellphone applications, making and receiving calls, accessing voicemail, managing contacts, and so forth, are well defined and relatively uncomplicated. As you move up to more general purpose computing on a mobile phone, complexity increases, and many of the usability factors that are important in the desktop space come into play. But there's less historical MS hegemony here. Finally, people seem to be more willing to accept learning a different way of navigating a new phone's interface than learning a new computer OS. So Linux has a clearer field in the embedded space.

Server

Finally, in the server market, Linux has done and will continue to do very well. GNU/Linux started life as a Unix clone, and Unix was and remains a server OS (MacOS X notwithstanding). Basic server applications on Linux are more mature, and the requirements for application usability are different. Engineers volunteering time for server development on the GNU/Linux platform look more like their users than their desktop cousins do. Indeed, the users are often those very engineers. Linux attracts a good deal of free R&D for new server roles. Today it's virtualization; yesterday it was clustering. Linux led both those technologies, and maintains strong positions in both today. Tomorrow, the popularity of Linux in academia and the support of big companies like IBM and HP guarantee that new trends in servers will be quickly taken up, if not actually pioneered, on Linux. And businesses care about some of the advantages F/OSS confers on computer users. Avoiding vendor lock-in and providing fundamental transparency in the infrastructure are attractive to business, along with low cost and high software quality. I'm aware there are advantages to Free Software beyond the ones I just listed, but those are the ones that tend to appeal to businesses interested in using Linux in the server room.

No Panaceas

Linux is not one thing. Diversity abounds. People contribute to it, and use it, for a huge variety of reasons. So it's natural that Linux succeeds differently in different contexts. It would be unreasonable to expect anything else. Panaceas don't actually exist in the real world. Linux is not a panacea; it's a real, live, vital OS, with its own unique set of strengths and flaws. It will continue to do well in many areas, and may do better on the desktop some day. But Linux has farther to go there than in the markets that leverage its natural strengths.

Dead Software?

If you stop thinking about Linux and F/OSS as singular answers to the problems posed by computer systems and applications software, you may be able to see that software sold for profit has strengths and weaknesses too. However much you may wish it weren't so, from the user standpoint, software written for money rather than love is often the better solution. Alternatives to many of the flagship proprietary applications need to provide more than high quality code. They have to be designed, packaged, documented and supported well enough to compete with their closed source competitors. One problem is that software engineers who code for love are often not the best people to deliver on all of those requirements. Until that, and many other things, change, proprietary software will stay off life support.