Role of computers in a neuro critical care unit
The last few years have witnessed a phenomenal explosion in the field of critical care computing. We are in the age of the microchip. A secretary can spell-check a document in a few seconds. A ward clerk can produce a graph of a patient's serum sodium values over the last five admissions with a few keystrokes. We take it for granted that a microchip will analyze 100,000 heartbeats every day and identify arrhythmias and asystoles. No longer are we impressed by microchips in pumps, ventilators and pulse oximeters. Computer-generated EEG reports are commonplace.

Interventional radiology
Superselective catheterization will be increasingly refined, so that intraoperative embolisation becomes routine. Aneurysms are now being treated with coils by that new breed, the endovascular neurosurgeon, who is more than an interventional neuroradiologist. Fibrinolysis of acute stroke with urokinase or tPA has raised great expectations. Angioplasty and stenting for symptomatic carotid stenosis are now available in India. Intraoperative chemotherapy and stereotactic intratumoural chemotherapy will ensure high concentrations of antimitotics within the tumor. Culture and sensitivity testing will enable the right antimitotic to be chosen.

In a CCU, large volumes of data must be stored, processed and used for quick and repeated clinical decision making. Effective communication is vital in the CCU. Networking with different departments ensures availability of laboratory data in real time. Indecipherable handwriting will be a thing of the past. Using a modem and a PC, the neurosurgeon can make effective rounds from his bedroom. In a difficult case, instantaneous clarification can be obtained from a specialist in another continent by transmitting all the data.

Computer systems normally wait until a request is made. If requested, the therapeutic aspects of various antibiotics – the specific bacteria covered, relative effectiveness, complications, cost, etc. – will all be displayed. Yet such a system cannot countermand an order for a drug that is totally inappropriate or even dangerous. Tomorrow's computers, with artificial intelligence in the form of neural networks, will be programmed to respond in different ways. A nurse in the CCU will require a computer-generated order before the physician's order is implemented. The computer will take into account every known parameter for the given patient while evaluating the physician's orders. A warning message will appear on the screen – "Please check dose again" – and the reasons will be displayed, including the relevant citations. Systems which intercept orders before they can be executed are now available in several centers. For example, an investigation which is not standard for the work-up of a particular condition will not be transmitted to the radiology department. An extensively used program called HELP (Health Evaluation through Logical Processing) has proved effective not only in making a diagnosis, but also in alerting the physician to avoidable problems. Treatment suggestions are also given. An integrated "Medical Information Bus" is on the anvil. Linked to 255 devices, this network will intercommunicate in 73 seconds.
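The order-screening behaviour described above can be pictured with a small sketch. This is not the logic of HELP or of any actual hospital system; the drug names, dose ceilings and warning text are invented purely for illustration.

```python
from typing import Optional

# Hypothetical per-drug daily ceilings (mg); illustrative values only.
MAX_DAILY_DOSE_MG = {
    "gentamicin": 400,
    "phenytoin": 600,
}

def check_order(drug: str, dose_mg: float) -> Optional[str]:
    """Return a warning string if the order exceeds the configured limit, else None."""
    ceiling = MAX_DAILY_DOSE_MG.get(drug)
    if ceiling is not None and dose_mg > ceiling:
        return (f"Please check dose again: {dose_mg} mg of {drug} "
                f"exceeds the configured daily limit of {ceiling} mg.")
    return None

warning = check_order("gentamicin", 520)
if warning:
    print(warning)   # shown on screen before the order is executed
```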

A program called Meditel was tried in the pediatric ICU. The age, sex, symptoms, signs and a host of other information requested by the computer were fed in. The computer would then give a differential diagnosis with an uncertainty factor. Though accuracy was only 70% to 90%, it was found that with computer-assisted diagnosis there was a trend towards shorter hospital stay, decreased use of consultants and fewer costly tests. Simple algorithms for computer-assisted diagnosis involve break points that represent all-or-none decision points; there is no room for "maybe" or "rarely". However, ingenious programs using Bayesian logic are now available which can give differential weightage to a complex array of coexisting symptoms and signs, and can even accommodate "perhaps" and "maybe". In one such application on 331 patients with proven myocardial infarction, experienced physicians made the correct diagnosis in 79% of cases while the computer got it right in 92% of cases. Several similar applications will be available in the field of neurological sciences.
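To make the Bayesian weighting concrete, here is a toy differential-diagnosis sketch. The three diagnoses, the findings, and every prior and likelihood are invented numbers, not figures from Meditel or from the myocardial infarction study cited above.

```python
# Prior probabilities of each diagnosis before any findings are entered.
PRIORS = {"meningitis": 0.2, "subarachnoid haemorrhage": 0.3, "migraine": 0.5}

# P(finding present | diagnosis) - invented for illustration.
LIKELIHOODS = {
    "fever":         {"meningitis": 0.9, "subarachnoid haemorrhage": 0.1, "migraine": 0.05},
    "neck rigidity": {"meningitis": 0.8, "subarachnoid haemorrhage": 0.7, "migraine": 0.02},
    "photophobia":   {"meningitis": 0.6, "subarachnoid haemorrhage": 0.5, "migraine": 0.8},
}

def posterior(findings):
    """Combine priors with the likelihood of each present finding, then normalise."""
    scores = dict(PRIORS)
    for f in findings:
        for dx in scores:
            scores[dx] *= LIKELIHOODS[f][dx]
    total = sum(scores.values())
    return {dx: s / total for dx, s in scores.items()}

print(posterior(["fever", "neck rigidity"]))
```

The output is a normalised differential – each diagnosis carries an explicit probability, which is the kind of "uncertainty factor" such programs report.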

Neural prosthesis
In the days of yore, preachers of the gospel claimed that one day the Son of God would descend from the heavens. The blind would then see, the deaf would hear, the dumb would talk and the crippled would walk! Neural prostheses may make this a possibility. Ultrathin chips placed surgically at the back of the eye could work in conjunction with a miniature camera to stimulate the optic nerves. The camera would fit into a pair of eyeglasses, and a laser attached to the camera would both power the chip and send its visual information via an infrared beam. The microchip would then excite the retinal nerve endings just like healthy cells, producing some sort of crude vision. Today, cochlear implants are available even in India. Voice synthesizers will help the speechless talk.

Virtual reality in medicine
Virtual reality in surgical education will soon be routine, reducing iatrogenic complications. The surgeon's level of efficiency can be monitored. Since surgery is a series of tasks and each task is a series of steps, it may be possible to use "fuzzy logic" and even quantify surgical competence, as sketched below. VR simulators are becoming increasingly complex. The "Green Telepresence Surgery System" consists of a surgical workstation and a remote worksite. At the remote site, there is a 3D monitor with dexterous handles providing force feedback. The VR surgical simulator is a stylized recreation of the human abdomen with several essential organs. Using a helmet-mounted display and data glove, a person can learn anatomy from a new perspective by "flying" inside and around the organs. Surgical procedures can be practiced with a scalpel and clamp. Innovative virtual reality techniques are now being used for quicker rehabilitation of physically disabled patients. Worlds previously non-existent can now be "explored" by the handicapped. The hospital of the future will first be designed and tested in virtual reality, bringing together the full power of the digital physician and his colleague in computer science.
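As a rough illustration of how fuzzy logic might grade a procedure made up of timed, measured steps, consider the sketch below. The membership functions, the two quantities measured per step (time taken and a tissue-handling error) and all thresholds are assumptions made purely for illustration.

```python
def fast(seconds, ideal=30.0, limit=120.0):
    """Membership in 'performed quickly': 1 at the ideal time, 0 at the limit."""
    if seconds <= ideal:
        return 1.0
    if seconds >= limit:
        return 0.0
    return (limit - seconds) / (limit - ideal)

def accurate(error_mm, tolerance=5.0):
    """Membership in 'performed accurately': 1 with no error, 0 beyond tolerance."""
    return max(0.0, 1.0 - error_mm / tolerance)

def step_score(seconds, error_mm):
    # Classic fuzzy AND: the step is only as good as its weaker aspect.
    return min(fast(seconds), accurate(error_mm))

# A simulated procedure: each step recorded as (time in s, error in mm).
steps = [(35, 1.2), (50, 0.8), (90, 3.5)]
competence = sum(step_score(t, e) for t, e in steps) / len(steps)
print(f"Overall competence grade: {competence:.2f}")   # 0 (novice) .. 1 (expert)
```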

The BioMuse (controlling computers with neural signals)
Conventionally, the mouse and the keyboard are used as the interface to communicate with a computer. Using the body's bioelectricity to activate a computer has fascinated scientists for the last two decades, but it is only recently that trials have been conducted. That minute electrical discharges are generated by muscles, nerves and the brain is well known. Recording these potentials from the skin surface is routine. However, one cannot simply attach sensors to a person's skin and plug the wires into the back of a conventional computer. First, the signals have to be amplified 10,000 times. Other circuits are then required to convert the amplified EMG signals to a digital form. These digitized signals are then processed to provide input to the computer, just like a mouse.
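A minimal sketch of that EMG-to-cursor chain is given below, assuming the signal has already been amplified and digitized: the samples are rectified, smoothed into an envelope, and anything above a resting level is turned into cursor speed. The sampling rate, resting level and gain are illustrative assumptions.

```python
import numpy as np

FS = 1000            # samples per second from the (hypothetical) A/D converter

def emg_envelope(raw_uv, window_ms=100):
    """Rectify the EMG and take a moving average to get its envelope."""
    rectified = np.abs(raw_uv)
    win = int(FS * window_ms / 1000)
    kernel = np.ones(win) / win
    return np.convolve(rectified, kernel, mode="same")

def cursor_velocity(envelope_uv, rest_uv=20.0, gain=0.5):
    """Activity above the resting level moves the cursor; stronger = faster."""
    return gain * np.maximum(envelope_uv - rest_uv, 0.0)

# Simulate a brief forearm contraction riding on baseline noise.
t = np.arange(0, 2, 1 / FS)
raw = 10 * np.random.randn(t.size)
raw[500:1200] += 80 * np.sin(2 * np.pi * 90 * t[500:1200])

speed = cursor_velocity(emg_envelope(raw))
print(f"peak cursor speed: {speed.max():.1f} pixels/frame")
```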

Similarly, the electrical potential generated at the junction of the cornea and retina can also be used. Electronic circuits can detect the tiny voltage fluctuations on a person's face when the eyes shift in orientation. Using headgear similar to a cordless mouse, these signals can be used to move a cursor. Using "fuzzy logic", a person can position a cursor on a computer screen by moving his eyes. By eye movements alone, letters on the computer screen can be selected. Though it takes time even to form words, documents can ultimately be created using eye movements alone.
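The same idea can be sketched for the eye-movement (EOG) channel: a horizontal and a vertical voltage are read off the face, small fluctuations near the centre are ignored, and larger deflections are mapped to cursor steps. The channel scaling, dead zone and gain below are assumptions for illustration only.

```python
def eog_to_cursor(horizontal_uv, vertical_uv, dead_zone_uv=15.0, gain=0.2):
    """Convert a pair of EOG readings into (dx, dy) cursor steps in pixels."""
    def step(v):
        if abs(v) < dead_zone_uv:      # ignore blinks and jitter near the centre
            return 0.0
        return gain * (v - dead_zone_uv * (1 if v > 0 else -1))
    return step(horizontal_uv), step(vertical_uv)

x, y = 400, 300                                       # cursor starts mid-screen
for h, v in [(60, 0), (60, 0), (0, -50), (10, 5)]:    # simulated gaze shifts
    dx, dy = eog_to_cursor(h, v)
    x, y = x + dx, y + dy
print(f"cursor now at ({x:.0f}, {y:.0f})")
```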

That the human brain produces measurable levels of electrical activity is well known. Measurements on the scalp can detect the underlying electrical activity of neurons. For decades researchers have tried to correlate EEG signals with specific behavior patterns. Attempts are also being made to isolate specific EEG signals that can be adjusted at will. Most attempts to control a computer with continuous EEG measurements work by monitoring alpha or mu waves, because people can learn to change the amplitude of these rhythms by making the appropriate mental effort. Visualizing various motor activities such as swallowing, chewing and smiling can result in alteration of mu activity. A computer cursor can be programmed to shift with changes in the amplitude of the measured mu waves. This is equivalent to a thought-activated electronic switch.
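A toy version of that thought-activated switch is sketched below: the power of one EEG channel in the mu band (roughly 8-12 Hz) is compared with a calibrated resting threshold, and suppression of the rhythm (as during imagined movement) flips the switch. The sampling rate, threshold and synthetic signals are assumptions for illustration.

```python
import numpy as np

FS = 256                               # EEG samples per second (assumed)

def mu_power(eeg_window, lo=8.0, hi=12.0):
    """Band power of the window in the mu band, computed via the FFT."""
    spectrum = np.abs(np.fft.rfft(eeg_window)) ** 2
    freqs = np.fft.rfftfreq(eeg_window.size, d=1 / FS)
    band = (freqs >= lo) & (freqs <= hi)
    return spectrum[band].sum()

def switch_state(eeg_window, threshold):
    """A suppressed mu rhythm (imagined movement) counts as 'switch on'."""
    return mu_power(eeg_window) < threshold

# Toy calibration: resting EEG is rich in mu, "imagined movement" is not.
rest = np.sin(2 * np.pi * 10 * np.arange(FS) / FS) + 0.2 * np.random.randn(FS)
imagine = 0.3 * np.random.randn(FS)
threshold = 0.5 * mu_power(rest)

print(switch_state(rest, threshold))     # False: cursor stays put
print(switch_state(imagine, threshold))  # True: cursor moves
```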

Evoked potentials are signals in the brain which occur a fraction of a second after it is provoked by a stimulus – visual, auditory and so on. By keeping one's gaze fixed on an appropriate square for a few seconds, a person wired with scalp electrodes can convey a choice to a computer. The machine monitors the form and timing of the evoked potential and so can discriminate which of the coded flashes caused the evoked electrical activity of the brain. Ultimately, one may even be able to unravel the specific electrical activity of the brain associated with specific thoughts. Direct neural communication between humans and computers may not always be science fiction. The computer of the third millennium may have biological signal sensors and in-built thought recognition software.
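The flash-discrimination step can be sketched as follows: EEG epochs time-locked to each square's flashes are averaged, and the square whose average response is largest in a window around 300 ms is taken as the one the user was watching. The timings, the window and the synthetic data below are assumptions, not parameters of any actual system.

```python
import numpy as np

FS = 256                                         # EEG samples per second (assumed)
WINDOW = slice(int(0.25 * FS), int(0.40 * FS))   # ~250-400 ms after each flash

def choose_square(epochs_by_square):
    """epochs_by_square: {square: array of shape (n_flashes, n_samples)}."""
    scores = {sq: ep.mean(axis=0)[WINDOW].mean()   # mean evoked response per square
              for sq, ep in epochs_by_square.items()}
    return max(scores, key=scores.get)

# Synthetic data: only the attended square carries a small positive deflection.
rng = np.random.default_rng(0)
def epochs(attended):
    data = rng.normal(0, 1, (10, FS))
    if attended:
        data[:, WINDOW] += 2.0            # crude stand-in for the evoked potential
    return data

trial = {"A": epochs(False), "B": epochs(True), "C": epochs(False)}
print(choose_square(trial))               # -> 'B'
```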

The chip of a computer can never reach the compactness of a neuron, a billion of which can be packed into one cubic inch. Growing nerve cells in culture media may one day be possible, and preliminary work has commenced on building a biological computer. Highly sophisticated electronics and circuitry will eventually give way to biotechnology. The future of computing will be based on the ubiquitous DNA/RNA molecule. Protein-based computers may some day replace the silicon chip.