
A Short Biography

(Recently the Charleston Post and Courier ran a High Profile story about my views of learning and the Internet. It captures the essence of what I am all about.)

I grew up in Greensboro, N.C., in a home that backed onto an old forest filled with ponds, creeks, an old cabin and lots of places to explore. My friends and I explored everywhere, brought little pond creatures back and made aquariums to watch tadpoles change into frogs. I was full of curiosity, and my parents gave me the freedom to chase it (as long as I came home at dinner time).

I discovered the fun of engineering from my dad - probably when I was 5 or 6 years old. He taught me three wonderful lessons:

  • Always open: He never met anyone he could not learn something from
  • Anticipation: Good helpers knew what to do before their bosses knew what to do
  • Brain food: Be curious
These lessons have proved to be long-term tools in my life. Realizing that everyone I met had something to teach me kept my mind open and curious - sort of viewing the world as one big open source community, long before open source was even dreamed of.

The second lesson exposed me to analytical thinking and anticipating multiple outcomes. Many weekends he would take me on short trips to repair elevators. My brother and I together were like oil and water, so we traded these weekend adventures. When I went, I was his helper. Early in my helping career, I simply got in the way and demanded to be micromanaged. He quickly pointed out that if he had to tell me everything to do, then he was slower than if he did everything himself. Being a stubborn child, I simply sat and watched, often falling asleep. Soon I learned that watching was more boring than helping, so I started to learn about helping without supervision. He impressed on me that I had to understand enough of what was going on to anticipate which tool he would need next. I learned that if I limited my anticipation to a single path, then I was often wrong. If I expanded my anticipation to multiple paths, then I could usually give him the tool he needed. This became a game - could I walk through an entire elevator-repair episode with a perfect next-tool record? I never won this competition, but I learned a lot about motor-generator sets, AC drive motors, relay sequences, leveling devices and the like.

The art of accurate anticipation was predicated on my understanding the problem we were solving. I discovered that I often did not know what I did not know, so it became necessary to develop skill in overtly identifying gaps in my knowledge. I also realized that if I could not articulate the problem, then I could not anticipate the next tool needed. If I could articulate the problem, and then ask a few questions to fill gaps in my understanding, then I could anticipate accurately. This skill proved essential for everything I did - whether signal processing, medical databases, software engineering, drug-ion channel interactions or arrhythmogenesis. It was predicated on a lively curiosity, a willingness to expose my ignorance, and his willingness to reward my curiosity rather than punish my lack of knowledge. This skill continues to be an important part of my life and is essential for my current chase: Internet-centric learning and how to avoid the forgetting curve.

My formal education was through the Greensboro public school system. My geometry teacher, Mrs. Burnside, further improved my skill in analytical thinking and anticipation. Mr. Johnson (Jabbo) was my physics teacher and was an expert at igniting curiosity. These teachers made education fun and gave me a hunger for problem solving and new learning. My great teachers were balanced by some bad experiences. The head librarian, where I worked part time, told me on several occasions that I would amount to nothing because I tried to turn everything into something fun. Looking back, this comment forced me to critically assess whether she was correct - or whether I had something special. Jabbo convinced me I had something special.

I continued my education in electrical engineering at Duke, receiving B.S. and M.S. degrees in Electrical Engineering from Duke University in 1963 and 1965; my master's thesis was about current distribution in the heart associated with external stimulation (defibrillation and inducing fibrillation). I received my Ph.D. degree in Biomedical Engineering and Biomathematics from the University of North Carolina in 1968 (I was the first graduate of the UNC program). During the last weeks before my final oral presentation, Jim Grizzle and I worked out the categorical general linear model - possibly the most important work I did (other than my spider photos). I came to Eugene Stead's Department of Medicine as an Associate in Biomathematics in 1966 and became an Assistant Professor of Medicine (computer science) in 1968. I was one of the three founding members of the Duke Computer Science department in 1971 and was promoted to Associate Professor of Computer Science. In 1978 I was promoted to Professor of Computer Science and Associate Professor of Medicine. In 1990, Joe Greenfield managed to promote me to Professor of Experimental Medicine in the Division of Cardiology. This joint relationship between medicine and computer science has been a continuing source of surprises (some even pleasant) and has provided a foundation for attacking important problems in both the clinical and basic science arenas. From 1993 to 1994, I was a visiting professor of Biomedical Engineering at the Indian Institute of Technology, paid as an Indian national (Rs 8500/month). Living as an Indian for a year was an unforgettable experience. In 1997, I retired from Duke and spent the fall as a Fulbright Scholar at the University of Patras. Then in January 1998, I joined the faculty at the Medical University of South Carolina as Associate Provost for Information Technology and Professor of Biometry/Epidemiology and Professor of Medicine (Cardiology).

My curiosities

One can only be impressed by the differences between the elements used to build physical computational systems and those used to build biological ones. Physical systems communicate at speeds approaching that of light (0.3 meters/nanosecond) while biological systems communicate at a snail's pace: 10 meters/sec. Physical systems are built from switching elements that enjoy a high degree of noise immunity, while biological systems are composed of switching elements that frequently fail and have a poor signal-to-noise ratio. For me, it is interesting to probe biological systems in order to understand how they can compute, recognize patterns and "think" with such "poor" computing equipment, yet perform these computations at speeds that shame "physical" computers.

With these physical components, biological systems operate efficiently and robustly and perform a wide range of functions. In the brain, three of these functions are learning concepts and facts (memorizing), remembering and thinking. Learning is the process of absorbing information and forming patterns in the brain. Remembering is the process of recalling these patterns and creating mental images of what we remember. Thinking consists of taking these structures and creating new structures reflecting something "new" - a new concept or a new fact. Today, computer memory is much more reliable than human memory while computer thinking is worse. So it's interesting to consider a new educational paradigm where we shift the emphasis from memorizing-remembering to thinking, relying on search engines and the internet for the memory as opposed to traditional learning, i.e. content mastery via the memorization route. The brain is highly complex and only now are we beginning to understand the basics of memory. Understanding thinking seems to be quite a way off in the future. But both are based on the interaction between cells. Because the heart has a simpler cellular organization, I started my research career there - exploring excitability of cardiac cells, propagation, and disturbances of excitability and propagation associated with ion channel blockade, ischemia and structural complexities.

My research interests focus on two broad areas: (1) education and the two neural components, learning and thinking; and (2) computational biology with a particular emphasis on cardiac electrophysiology. My educational ideas, specifically the evolution of an internet-centric learning and working paradigm, are described below. The electrophysiology emphasis is described here.

Cardiac cells are arranged in a more or less orderly manner, and the nature of "cardiac computation" is rather less complex than that associated with neuronal tissue. Consequently, cardiac "computation" hopefully reflects the behavior of simpler and perhaps generic mechanisms in comparison with neuronal networks, and will be easier to understand. My assumption is that if I can understand the nature of "cardiac computation", i.e. the initiation and control of cardiac electrical excitation, then I can approach problems in more complex "nets" of excitable cells, i.e. neural nets. This is supported by the observation that the biologically unrealistic FitzHugh-Nagumo excitable cell reveals many generic properties that persist even in more complex and more realistic models of excitable cells. These generic properties include: a threshold of excitability, stable and unstable (spontaneous oscillation) behavior, a refractory period, wave motion and vulnerability.
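
For the curious, here is a minimal numerical sketch of the FitzHugh-Nagumo model in one common parameterization (the parameter values are illustrative, not taken from any of our studies). Even this two-variable caricature shows the threshold, refractory and oscillatory behavior mentioned above.

    # Minimal sketch: FitzHugh-Nagumo excitable cell, integrated with forward Euler.
    # Parameter values (a, b, eps, I) are illustrative textbook-style choices.

    def fitzhugh_nagumo(v0=-1.0, w0=-0.5, I=0.5, a=0.7, b=0.8, eps=0.08,
                        dt=0.01, steps=20000):
        """Integrate dv/dt = v - v^3/3 - w + I, dw/dt = eps*(v + a - b*w)."""
        v, w = v0, w0
        trace = []
        for n in range(steps):
            dv = v - v**3 / 3.0 - w + I          # fast (excitation) variable
            dw = eps * (v + a - b * w)           # slow (recovery) variable
            v, w = v + dt * dv, w + dt * dw
            trace.append((n * dt, v, w))
        return trace

    if __name__ == "__main__":
        # With I = 0.5 the model oscillates spontaneously; with I = 0 it sits at
        # rest and fires a single spike only when pushed past threshold - two of
        # the generic properties listed above.
        for t, v, w in fitzhugh_nagumo()[::2000]:
            print(f"t={t:7.2f}  v={v:+.3f}  w={w:+.3f}")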

Work at Duke, 1965-1997: Computational Biology: Cardiac Electrophysiology and Teaching in Disguise

As an engineer in Gene Stead's clinical department, I had free run of the Dept of Medicine, with Gene, of course, nudging me in a variety of interesting directions. One of his most innovative pushes was to encourage me to continue my relationship with Jim Grizzle at the UNC Dept of Biostatistics.

From 1968 until 1980, I designed and implemented an interactive, multiuser computer system and database for monitoring physiologic parameters and tracking patients with coronary artery disease. As part of this work, I developed, with J.E. Grizzle and G.G. Koch at UNC, the first non-iterative estimation procedure for categorical linear models, which became the foundation for testing many biomedical hypotheses involving discrete (in contrast to continuous) data. Today, the cardiology database is the world's largest repository of data on patients with coronary artery disease and is the basis of many research projects focused on clinical decision support systems and cost/benefit analysis. The paper with Jim is also one of the most cited papers in biostatistics, establishing new chapters in statistics textbooks on the analysis of categorical data. At MUSC I began to explore ways to adapt the concepts learned with the Cardiac Database to an institutional setting, and realized that the move toward compliance-driven record keeping created an opportunity: by linking financial and clinical data streams, it would be possible to build a financially feasible clinical record based on the costs associated with maintaining a compliant clinical repository.

In 1980, Joe Greenfield took over as Chief of Cardiology and "suggested" that I shift my attention to cellular communication with emphasis on how drugs control the communication between excitable (either cardiac or neuronal) cells. Thus began a long and continuous relationship with Gus Grant and Harold Strauss. Together we addressed a number of problems dealing with drug-ion channel interactions and developed the guarded receptor model of drug-channel chemistry. By this time, it was clear that international collaboration would greatly facilitate our attack on important clinical problems. Members of the All Union Institute of Experimental and Clinical Cardiology in Moscow approached us about working jointly to better understand the side effects of drugs being developed in the then USSR. This was the beginning of the development of a "Laboratory without walls" that was linked by the Internet for continuous collaboration.

Our laboratory was the second (we lost the race by 3 months) to characterize the single-channel behavior of the cardiac sodium channel. In addition, we were the first to develop a mathematical model and statistical estimation procedure for characterizing drugs that interact with membrane ionic channels. In 1987 we extended our laboratory to include Prof. V. I. Krinsky's autowave laboratory at the Institute of Theoretical and Experimental Biophysics in Pushchino, Russia. Prof. Krinsky is an authority on wavefront formation in excitable media, and together we developed a model of reentrant cardiac arrhythmias that has proved useful in evaluating drugs used to control heart rhythm disturbances in patients. As a result of our theoretical studies of ion channel blockade, we developed a clinical procedure for reversing the cardio-toxic effects of drugs that influence cardiac and neuronal cells. Many abused substances, including tricyclic antidepressants, cocaine and synthetic opiate analgesics, block cardiac and neuronal sodium channels, and our procedure has been found effective in reversing some of the cardiac consequences of drug overdose.
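
In generic notation (a rough sketch of the idea, not a transcription of our papers), the guarded receptor picture says that drug binding to the channel follows ordinary first-order kinetics, but access to the receptor is gated by the channel's own conformational state:

    \frac{db}{dt} = k\,[D]\,g(t)\,(1-b) \;-\; l\,h(t)\,b

where b is the fraction of drug-blocked channels, [D] the drug concentration, k and l the binding and unbinding rate constants, and g(t), h(t) are guard functions (0 or 1) set by the channel gates. While the guards are constant the equation is linear, so block relaxes exponentially toward k[D]/(k[D]+l); stitching these exponential segments together across a periodic train of action potentials is what makes use-dependent block analytically tractable.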

Currently, our laboratory is involved in cellular and tissue studies of cardiac rhythms and the mechanisms that initiate rhythm disturbances. We are developing efficient algorithms for the analysis of single-channel and voltage clamp data from studies of single cardiac cells, for the solution of non-linear parabolic partial differential equations, and for exploring control of wave formation in non-linear excitable media. We also use numerical studies to probe observations obtained during cellular and patient studies. Our laboratory is tightly coupled with the clinical cardiac electrophysiology service, where patients provide another source of challenging research questions. A good example is our recent paper linking several inherited cardiac arrhythmias to a single mutation and our numerical studies of how mutant Na channels and drugs alter the cardiac vulnerable period.

1997 - 2006: A transition to the University of Patras and the Medical University of South Carolina

1997 marked the start of a new era for me. I was awarded a Fulbright Scholar teaching/research award and spent July through January '98 as a visiting professor of Medical Physics at the University of Patras in Greece. It gave me the time to explore ideas in learning and education that I had developed while at Duke. I managed to become adopted by the community of Romanian students at the Univ. of Patras, and they proved very willing participants in our educational experiments. By the end of my time, I could speak passable Greek, take underwater photographs, ride a bicycle downhill into a 50 km/hr head wind and facilitate learning with my Greek, Romanian and Bulgarian students. In September (midstream), I retired from Duke as Professor Emeritus of Computer Science. In January, after returning from Patras, I accepted a new challenge at the Medical University of South Carolina as Vice Provost for Information Technology. Here, I tested many ideas about education and continued my research with Xiaobai Sun, Gus Grant and Maddy Spach and our international laboratory.

Applying ideas developed at Duke about problem solving and thinking: The MUSC IT Lab

At MUSC, I have been quite active in applying everything I had learned at Duke about learning, thinking and problem solving to the MUSC setting. I started by adapting to the MUSC environment the "tool-based" problem-solving paradigm that Gus and I developed for his lab. The basic idea was to follow the original UNIX paradigm of Dennis Ritchie, Brian Kernighan and Ken Thompson - i.e. to identify a primitive set of "filters" that could be strung together, using the notion of a pipe, to accomplish some goal. (For a great survey of the early UNIX ideas, look at the Bell System Technical Journal, vol. 57, no. 6, part 2, July-August 1978, and the AT&T Bell Laboratories Technical Journal, vol. 63, no. 8, part 2, October 1984.) The IT Lab was the nucleus of our tool-based revolution - a group of 6 of the friendliest guys you'll ever meet.
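
As a toy illustration of the filter-and-pipe idea (the filters and sample data here are invented for the example, not actual IT Lab tools), the same pattern can be sketched in Python with generators, each one a small filter that transforms a stream and hands it to the next:

    # Toy illustration of the UNIX filter-and-pipe idea using Python generators.
    # Stringing the filters together mimics something like:
    #   cat log | grep submit | cut -f3 | sort | uniq -c

    from collections import Counter

    def read_lines(text):
        for line in text.splitlines():
            yield line

    def grep(lines, needle):
        return (ln for ln in lines if needle in ln)

    def cut(lines, field, sep="\t"):
        return (ln.split(sep)[field] for ln in lines)

    def uniq_c(items):
        return Counter(items).most_common()

    if __name__ == "__main__":
        sample = ("2004-01-02\tsubmit\tfrank\n"
                  "2004-01-02\tapprove\tmary\n"
                  "2004-01-03\tsubmit\tfrank")
        # Who submitted forms, and how often?
        print(uniq_c(cut(grep(read_lines(sample), "submit"), 2)))  # [('frank', 2)]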

The unstated idea in my strategy is that we must build an infrastructure based on commodity computing and global connectivity (the internet) that fosters internet-centric learning and problem-solving paradigms. Why problem solving? Because our economy has shifted to a service economy, which is predicated on problem solving (the service that is provided). Our computing and communication infrastructure gives individuals the possibility to assemble a multidimensional workspace of tools (one tool per dimension) matched to the problem environment. The worker/learner then transports data from a variety of resources, manipulates these data into new constructs and solves problems.

This strategy is not unlike thinking itself. Consider the brain - it is capable of three activities: learning, remembering and thinking. Time spent learning and remembering, that is, memorizing and recalling patterns, cannot be spent thinking, that is, constructing new and interesting patterns from stored structures. With commodity computing and global connectivity, human memory becomes a liability - i.e. less reliable than a computer's memory. On the other hand, human thinking continues to be far superior to any machine-thinking models. So our goal is to create a workspace that capitalizes on machine memory and shifts the emphasis of daily work from a dependence on human memory to a dependence on internet-accessible resources. (A common criticism: how can you trust internet-accessible information? The answer is the same tool we use to judge any information - critical analysis of the data. So we must start teaching analytical and critical thinking much more widely.) Our workspace is focused on solving problems, and because problem solving requires thinking and accessing applicable information and insights, we automatically create an environment for continuous learning. Problem solving in this environment depends on chasing one's curiosity. To emphasize this concept, I would give every member of our community of learners the following name tag: Frank: Curiosity@Work or Mary: Curiosity@Work.

To press forward with this strategy, we have already developed some useful tools for facilitating the movement of data around the campus and ultimately into various databases. We have tried to be as standards-based as possible. The underlying notion is to capture data via a web browser and move it around the institution in a way that mimics paper flow. The underlying data capture device is a PDF or HTML form. From the input data, we package the information into a portable format (tab-delimited records, XML objects, etc.) and move it into a database, often MySQL. The entire process of submitting and approving (or managing) data is referred to as a workflow manager. More recently, Christopher Zorn has generalized the concept of the workflow manager to base it on an XML message-switching system, Jabber.
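
A minimal sketch of the capture-package-store step, with invented field names and SQLite standing in for the MySQL backend so the example is self-contained:

    # Sketch: take data captured from a web form (here just a dict), package it as
    # a tab-delimited record, and move it into a database. Field names and the
    # table are hypothetical; sqlite3 stands in for MySQL so the example runs anywhere.
    import sqlite3

    def package(record, fields):
        """Flatten a submitted form into one tab-delimited line, in a fixed field order."""
        return "\t".join(str(record.get(f, "")) for f in fields)

    def store(db, line, fields):
        placeholders = ",".join("?" for _ in fields)
        db.execute(f"INSERT INTO submissions ({','.join(fields)}) VALUES ({placeholders})",
                   line.split("\t"))
        db.commit()

    if __name__ == "__main__":
        fields = ["submitted_by", "department", "status"]
        db = sqlite3.connect(":memory:")
        db.execute("CREATE TABLE submissions (submitted_by TEXT, department TEXT, status TEXT)")
        form = {"submitted_by": "frank", "department": "cardiology", "status": "pending"}
        store(db, package(form, fields), fields)
        print(db.execute("SELECT * FROM submissions").fetchall())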

The back end of the tool box - analysis and report generation - is under active development. We have some examples using XSL and LaTeX, and a filter that converts reports to PDF format. Our progress can be monitored at our web site: The MUSC IT Lab. To facilitate analysis and report generation, we have developed a number of tools for importing and retrieving data. MySiteMaker is a wonderful tool for building a web front end to a database and reporting data back as an HTML object, an XML object, an MS Excel object or a text file. MyGrants is an example of this interface to our grants database.
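
A rough sketch of the report-generation idea, with invented data; the real pipeline used XSL stylesheets, but the shape is the same: transform rows into a LaTeX document and let pdflatex produce the PDF.

    # Sketch of the report back end: turn rows of data into a LaTeX table and write
    # a .tex file that pdflatex can convert to PDF. Rows and columns are invented.

    ROWS = [("R01-HL-0001", "open", "2004-06-30"),
            ("R21-GM-0042", "closed", "2003-12-31")]

    def latex_report(rows, title="Grant status report"):
        body = " \\\\\n".join(" & ".join(r) for r in rows)
        return (
            "\\documentclass{article}\n\\begin{document}\n"
            f"\\section*{{{title}}}\n"
            "\\begin{tabular}{lll}\n"
            "Grant & Status & End date \\\\ \\hline\n"
            f"{body} \\\\\n"
            "\\end{tabular}\n\\end{document}\n"
        )

    if __name__ == "__main__":
        with open("report.tex", "w") as f:
            f.write(latex_report(ROWS))
        # Then, outside Python:  pdflatex report.tex   produces report.pdf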

Most recently we have been exploring ways to establish our tools as web services, and Christopher has taken ispell, the Unix spell-checking utility, and established a spell-check web service. The idea is to invoke a server-side utility via a GET or PUT that then makes a SOAP call to the ispell utility. As implemented, the utility can be incorporated as a bookmarklet so that just about any browser-displayed text can be spell checked, independent of the application that generated the display.
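
A simplified sketch of the spell-check service idea (not Christopher's actual SOAP implementation): a small HTTP endpoint that pipes the submitted text through ispell's list mode and returns the misspelled words. It assumes ispell is installed on the server.

    # Sketch of a spell-check web service: an HTTP GET handler that pipes the query
    # text through `ispell -l` (list misspelled words) and returns them as plain text.
    import subprocess
    from http.server import BaseHTTPRequestHandler, HTTPServer
    from urllib.parse import urlparse, parse_qs

    class SpellHandler(BaseHTTPRequestHandler):
        def do_GET(self):
            text = parse_qs(urlparse(self.path).query).get("text", [""])[0]
            result = subprocess.run(["ispell", "-l"], input=text,
                                    capture_output=True, text=True)
            misspelled = result.stdout.split()
            self.send_response(200)
            self.send_header("Content-Type", "text/plain")
            self.end_headers()
            self.wfile.write(("\n".join(misspelled) or "OK").encode())

    if __name__ == "__main__":
        # e.g.  curl 'http://localhost:8080/?text=teh+quick+brown+fox'
        HTTPServer(("", 8080), SpellHandler).serve_forever()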

2006 - Present: Duke-NUS Graduate Medical School Singapore


Exploring Problem solving, Problem-based learning and an Internet-centric workspace: Current activities

More recently we have been focusing on a 21st century learning paradigm - founded on an internet-centric workspace for learners at GMS (i.e. faculty, students and staff). The obvious strategy is to bring to the desktop a variety of proprietary and open source tools that facilitate access to information and problem solving. The not-so-obvious strategy is to actually develop a way of thinking that integrates the Internet and Internet resources into virtually every action we take.

When I am awake, I am either thinking, doing, learning or remembering. All of the mental energy I have is divided among these four activities. For the first time in my life, I have realized that the Internet and Google give me the freedom to retarget mental energy - shifting it away from remembering and toward thinking, learning or doing. To me this is an obvious choice - a no-brainer - but few understand the rationale.

I have been surprised by how few faculty can accurately articulate what the learning curve and the forgetting curve actually describe. I know that I can remember almost nothing unless I repeat what I want to remember. There is sufficient evidence from the neurobiology community to state that repetition is the first law of learning. Similarly, lack of repetition of learned material leads to forgetting (as described in the above link). Since I know I am not going to remember infrequently used information, why not substitute Google and Internet resources for some of my remembering? The major objection is that Internet material is unreviewed. I have never had any difficulty with this concept because I know that much reviewed material is simply wrong. What is missing is skill in analytical or critical thinking.
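
For concreteness, one commonly cited summary of the forgetting curve (following Ebbinghaus) models retention as an exponential decay,

    R(t) \approx e^{-t/S}

where R is the fraction retained a time t after learning and S is a stability that grows with each repetition. Repetition raises S and flattens the curve; without it, rarely used material fades - which is exactly why offloading infrequently used facts to a search engine is a reasonable trade.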

The time is right to recognize what search engines and Internet connectivity bring to both the learning process and our daily work. Work based on using our Internet memory is continuous learning - a byproduct of retargeting remembering energy toward learning, thinking or doing.

I assert that the global connectivity provided by the internet levels the "access to information" playing field, and I am confident that we can shift effort from learning/remembering to thinking in such a manner that the traditional boundary between learning and working is blurred. Rather, everyone is solving a problem in a manner where gaining access to new ideas and insights is part of the problem solving. In other words, lifelong learning is now indistinguishable from problem solving (work). To facilitate bringing these tools to the desktop, we have built The Duke-NUS learner's portal that can be customized by the end user. Our goal is to enable folks to chase their curiosity without getting bogged down in the mechanics of chasing. So, someone with some artistic talent, but with little skill in actual painting, might use photography, GIMP and some artistic filters to create impressionistic renderings of interesting photos. Photography captures an interesting scene, GIMP takes care of the mechanics of distorting this image in order to create some interesting artistic expression of an idea, while the internet and a browser provide access to the art of others (examples). You are left with only the task of investing your own energy, enabling you to chase your curiosity.

Gene Stead's Web Site and Web Logs

In March, Josh and I visited Gene at the lake and, over the next several months, built a small web site for him. John Williams became my surrogate at the lake. The web site was a struggle, and Gene managed to delete email and other small things that cast a shadow on our work. Then we hit on the idea of a web log, which seemed to provide a more robust tool for communication with his friends, colleagues and former house officers. The web log did not work out, as spammers continued to clutter it up. The web site has grown to include some of his essays, and now a section for essays from house officers. This has been a very interesting challenge - and I am certain there will be new challenges and surprises.

Distractions - large and small

In Singapore, wall-to-wall people get to me, so I have found escapes. I can escape to South India, Thailand, Malaysia, Indonesia and China. Within Singapore, I explore the lives and behavior of small creatures - mostly spiders. In Charleston, I discovered Nephila clavipes, and in Singapore there is a cousin, Nephila maculata. Recently I found that there exist kleptoparasitic spiders, specifically Argyrodes flavescens. These little orange spiders eat the webs of host spiders as well as steal their captured insects. Sharing this with my grandkids produced a web site. Enjoy.


This work is licensed under a Creative Commons License.

C. Frank Starmer