Tuesday, July 2, 2024

Artificial stars?

Community in House, M.D. [Media Notes 135 B]

When I wrote yesterday’s post about House, M.D. I figured I might have more to say about it, but I didn’t have any explicit plans to do so. When I woke up this morning, I had an idea or two. These ideas center around community.

In this case we’re dealing with community on two levels. On one level there’s the local community centered on our protagonist, Dr. Gregory House. This community includes his boss, Dr. Lisa Cuddy; his team of associates, Dr. Eric Foreman, Dr. Robert Chase, and Dr. Allison Cameron; and his best friend, Dr. James Wilson. This local group then “shades out” through the hospital more generally. On another level there’s the community at large. This is where the patients come from. The hospital serves that community.

Illness and injury threaten that larger community. When a person is ill, they cannot participate fully in the community. When a person dies, the community is diminished. The hospital community exists to serve that larger community. The House-centered community operates within the hospital to serve that larger community.

The irony of the show is that at the center of this web of communal interactions we have an unpleasant misanthrope. House isn’t interested in the community. He’s interested in disease. What is the role of disease in the House community, if you will?

It structures their interactions, gives them purpose. In particular, differential diagnosis seems to be the central organizing activity in that local community. They meet, discuss and argue, and then go about their various tasks, usually assigned by House, who may then hang out in the office while continuing to think, listen to music, watch TV, whatever. Those tasks will mostly take place elsewhere in the hospital, though House’s associates will regularly venture outside the hospital to investigate patients’ homes and/or places of work.

The activity of differential diagnosis itself takes place through talk and writing on a white board. The talk is highly technical. I assume that it’s more or less technically correct, but I don’t know enough about medicine to judge that and I assume that’s true for most of the audience. What’s important is that we see the activity. We see investigation taking place. We experience diagnosis as a communal activity. How does this little community maintain itself in face of disease and under the pressure of the often at odds personalities of these doctors? That’s what we’re seeing. That’s what the show is about.

And then there’s serendipity. The process of differential diagnosis is driven by evidence, a patient’s symptoms and history, and by medical knowledge. And then there’s serendipity. Something accidental happens, someone notices something that might be totally unrelated to the case, but it sparks a train of thought that becomes relevant. Serendipity takes us to the edge of community, if not beyond it.

The upshot is that we see the social construction of truth, a construction that is tested in the process of treatment. The detective show does the same thing, but with a different set of procedures. We know that House was inspired in part by Conan Doyle’s Sherlock Holmes and we know that the character of Holmes was based, in part, on that of a surgeon, Joseph Bell. So we’ve got a century’s worth of narrative history trailing behind House, M.D., with each title having its particular mix of ingredients. It’s my impression that most of the titles don’t focus on the process of reasoning as closely as House, M.D. does. But this is not the place to even begin to sketch out the coordinates of that space.

Diagramming sentences in the 19th century

I loved diagramming sentences in grade school. I'm sure I did it in 6th grade, but possibly 5th grade as well. Over at Language Log, Victor Mair has a post on the subject, Diagramming: history of the visualization of grammar in the 19th century, which links to a review article by Hunter Dukes that contains photographic copies of seven archival texts: American Grammar: Diagraming Sentences in the 19th Century. Mair's article has a number of interesting quotations from Dukes' text.

Monday, July 1, 2024

The Presidential debate as entertainment

I've been watching Zeth's videos for over a year, maybe even two. When I first started watching them it was just Zeth and his daughter, Saylor. Then his wife and son joined in, making them family videos. I'm putting this one up because they talk about the Presidential debate starting at 13:30. They don't see it as a serious political event. They see it as entertainment and mostly talk about Biden's poor condition.

Timestamps:

0:00 Visiting California
9:22 Zeth's Hair Problems
13:30 Presidential Debate
18:50 The New Pod
20:56 The Olympics
23:37 France
25:10 Florida Beaches
28:30 Atlas Joins

Flowers

What’s up with House, M.D.? [Media Notes 135]

House, M.D. has been on my to-do list for a couple of years. As you may know, it’s a medical drama that ran on Fox for eight seasons between 2004 and 2012. I watched it back then, and I’ve watched it online since then. I’m now in the second season of yet another rewatch.

I’ve got a simple question: What’s it about? Oh, I know, it’s about Dr. Gregory House, who’s a world-class diagnostician at the fictional Princeton-Plainsboro Teaching Hospital in New Jersey. What makes it interesting is that House, while a very good diagnostician, is an unpleasant person. He’s rude, self-centered, difficult to get along with, and insists on doing things his way, which often puts him at odds both with accepted medical practice and with Princeton-Plainsboro. Nor is he particularly endearing with his patients. He’s also in chronic pain and, consequently, addicted to pain medication. Each episode is built around a single case, though there may be one or two simpler cases on the side.

Given such a thoroughly unpleasant central character, why was the series popular enough to run for eight seasons and get a bunch of awards? To be clear, I’m not questioning the merits of the show; after all, I’m now on my third time through. No, what I’m curious about is how such a popular show could be built around such a thoroughly unlikable character.

Obviously, there wouldn’t be any show if House were incompetent. He had to have some redeeming quality, medical brilliance in this case. We’ve had medical dramas on TV since the fifties, and I’ve watched a bunch of them. I don’t recall any others where the protagonist was so unlikable and also of such a high level of competence. Those two things go together. Why?

Imagine another drama about a physician with House’s skills but who was more likeable, perhaps married as well, with a beautiful wife and cute children. That would be a very different show, and it wouldn’t work. (Or would it?) We also need to emphasize that House is a fairly technical show in that it features a lot of complex medical terms, some technology as well, and interior views of the body via special effects. There’s also the activity of differential diagnosis, which is on display 3, 4, 5, or more times during the show. That’s critical to the “texture” of the show. Offhand, the only precedent I can think of is CSI: Crime Scene Investigation, a forensics crime show that ran from 2000 to 2015, was a huge hit, and spawned spin-offs and several imitators. It also featured technical terms, apparatus, procedures, reasoning, and special effects.

I’m thinking that, since American culture is fairly anti-intellectual, or ambivalent about intellect at best, making the protagonist unpleasant and in pain is a way of taking the edge off his brilliance – yeah, he’s super-smart, but he’s also an unhappy asshole! I’m not sure I believe that, though it’s been on my mind for a while. It probably needs a better formulation, but I don’t know how to do it, and I fear that it would likely require a bit of apparatus-building.

One final remark: while the unpleasant and tortured genius doctor is at the center of the show, there are other important figures as well: his (female) boss, the Dean of Medicine; his best (and only) buddy, an oncologist; and his three associates, who changed as the show went on. None of these people are unpleasant in the way House is. Getting a good mix of associated characters is certainly part of the formula, but that’s true for all TV shows, movies, dramas, or narratives. What are the specific requirements of “compensating” for a character like Dr. Gregory House?

* * * * *

Given the prominence of AI these days, I assume someone’s working on a show built around an AI doctor. I note that Star Trek: Voyager has already done it, with its holographic doctor. But when that show aired, 1995-2001, AI was purely conjectural. That’s no longer the case. And that changes everything, no?

Bang that drum! [What fun]

Scalable LLMs without Matrix Multiplication

Rui-Jie Zhu, Yu Zhang, Ethan Sifferman, et al., Scalable MatMul-free Language Modeling, arXiv:2406.02528v5 [cs.CL]

Abstract: Matrix multiplication (MatMul) typically dominates the overall computational cost of large language models (LLMs). This cost only grows as LLMs scale to larger embedding dimensions and context lengths. In this work, we show that MatMul operations can be completely eliminated from LLMs while maintaining strong performance at billion-parameter scales. Our experiments show that our proposed MatMul-free models achieve performance on-par with state-of-the-art Transformers that require far more memory during inference at a scale up to at least 2.7B parameters. We investigate the scaling laws and find that the performance gap between our MatMul-free models and full precision Transformers narrows as the model size increases. We also provide a GPU-efficient implementation of this model which reduces memory usage by up to 61% over an unoptimized baseline during training. By utilizing an optimized kernel during inference, our model's memory consumption can be reduced by more than 10x compared to unoptimized models. To properly quantify the efficiency of our architecture, we build a custom hardware solution on an FPGA which exploits lightweight operations beyond what GPUs are capable of. We processed billion-parameter scale models at 13W beyond human readable throughput, moving LLMs closer to brain-like efficiency. This work not only shows how far LLMs can be stripped back while still performing effectively, but also points at the types of operations future accelerators should be optimized for in processing the next generation of lightweight LLMs. Our code implementation is available at this https URL.
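The abstract doesn’t spell out the mechanism, but the core trick in this line of work is constraining weights to ternary values {-1, 0, +1}, so that a matrix product collapses into sign-gated additions and subtractions with no multiplications at all. Here’s a minimal NumPy sketch of that idea — the function name and shapes are mine, not the paper’s, and this illustrates only the arithmetic, not the paper’s actual architecture or quantized training procedure:

```python
import numpy as np

def ternary_linear(x, W_ternary):
    """Compute x @ W_ternary without multiplications.

    W_ternary entries are restricted to {-1, 0, +1}, so each output
    element is just: (sum of x where w == +1) - (sum of x where w == -1),
    with w == 0 entries skipped entirely.
    """
    out = np.zeros(W_ternary.shape[1])
    for j in range(W_ternary.shape[1]):
        col = W_ternary[:, j]
        out[j] = x[col == 1].sum() - x[col == -1].sum()
    return out

# Sanity check against an ordinary matmul.
rng = np.random.default_rng(0)
x = rng.standard_normal(8)
W = rng.integers(-1, 2, size=(8, 4)).astype(float)  # ternary weights
assert np.allclose(ternary_linear(x, W), x @ W)
```

On real hardware the win comes from replacing multiply units with adders (hence the FPGA implementation the abstract mentions); the Python loop above is only meant to make the additions-only arithmetic concrete.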