
Going with the Information Flow

Data acquisition and instrument control technology, long buried inside automated laboratory instruments, can't be ignored anymore.

The following is a manuscript for an article published in R&D magazine. R&D magazine holds the copyright for the finished article.

C.G. Masi, Contributing Editor

Researchers involved in discovering and developing new pharmaceutical compounds generally don't have a lot of background in advanced semiconductor electronics. This we know.

They are chemists, biologists, medical doctors--almost anything but electrical engineers. They do what they do, and do it well, because they care about biochemistry, medicine, etc., not microelectronics. So, it's a bit unfair to expect them to dig into the bits and bytes of data acquisition (DAQ) systems, but that is just what the Universe is forcing them to do.

DAQ is nothing new in the biochemical laboratory, or any other laboratory for that matter. In the roughly 40 years since the folks at Digital Equipment Corporation (later absorbed into Compaq Computers, Houston, Tex.) made computers compact, affordable and easily programmable, DAQ technology is what has made automated instrumentation feasible. (For more information on minicomputer history, see the PDP-11 website.)

Up to now, however, vendors of automated laboratory instrumentation have managed to hide their DAQ technology under the covers of stand-alone instruments. They did this for the simple reason that they were using DAQ to make life easier for pharmaceuticals researchers, not more difficult. Why force researchers to deal with the ins and outs of data acquisition systems when they have more useful things to think about, such as which compound does what and how well?

"That was okay," Wolfgang Winter, Product Manager for Data Systems for the Life Science Business Unit of Agilent Technologies, Waldbronn, Germany, points out, "as long as the DAQ systems were islands of information in the environment. Now, the pharmaceuticals companies have to relate all of these islands of information into their ocean of information."

That change is forcing pharmaceuticals researchers to come face-to-face with DAQ systems, and forcing instrument vendors to find ways to help them do it with as little pain as possible.

What is this DAQ, anyway?

Data acquisition is, quite simply, the art of using digital computers to collect and record measurement results electronically. It has two major advantages: it can note instrument readings much more precisely and orders of magnitude more rapidly than any human laboratory assistant could ever do; and it can store those readings in permanent digital electronic records that are readily accessible to any analysis software. A third advantage, which is often just as important, although not as universally exploited, is the possibility of tightly integrating data acquisition with automated controls.
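In outline, that is all a DAQ program ever does: poll a digitized reading, timestamp it, and append it to a permanent electronic record. Here is a minimal sketch in Python--the sensor read is simulated, where a real system would call into the vendor's driver:

    import csv
    import random
    import time

    def read_sensor():
        """Stand-in for a driver call returning one digitized reading, in volts."""
        return random.uniform(0.0, 5.0)   # simulated signal

    # Collect 1,000 timestamped readings and set them down in a permanent record.
    with open("readings.csv", "w", newline="") as f:
        log = csv.writer(f)
        log.writerow(["timestamp_s", "volts"])
        for _ in range(1000):
            log.writerow([time.time(), read_sensor()])
            time.sleep(0.001)   # ~1 kHz pacing, far beyond manual note-taking

Even this toy loop records a thousand readings per second, which illustrates the speed advantage over any human laboratory assistant.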

Fig. 1 shows a DAQ system that Brant Bergdoll, Senior Project Engineer with V I Engineering in Indianapolis, Ind., built for Eli Lilly and Company (Indianapolis, Ind.) to use in testing the reactions of laboratory animals to experimental drugs intended to dilate or constrict blood vessels. V I Engineering is a third-party integrator that specializes in designing and building custom DAQ systems for clients in a wide variety of industries--obviously including the pharmaceuticals industry.

Fig. 1: This hemodynamics test system illustrates the basic data acquisition elements: sensors to transduce physical quantities to analog signals; signal conditioning electronics to prepare those signals for conversion to digital data; and a DAQ card to do the conversion and post the results to a host computer, which sets down a permanent electronic record. Courtesy V I Engineering, Indianapolis, Ind.

 

The measurement process begins with a set of pressure sensors inserted through catheters into the heart and major blood vessels of the test subject (whom we'll call "Bugs"). Specifically, one sensor monitors Bugs' left ventricular pressure and another measures his blood pressure. A third input comes from standard EKG equipment, which returns its waveform to the DAQ system through an analog-signal output.

The two blood-pressure sensors are fairly typical fluid-pressure transducers, in which the fluid pressure distorts a tiny membrane and a strain gauge measures the membrane's deflection. Such transducers are common, inexpensive and very, very fast and accurate. They do, however, require a source of DC excitation for the strain gauge, so Bergdoll put in a regulated power supply built by Hewlett-Packard (now Agilent Technologies, Palo Alto, Calif.).

The analog signals from the sensors go directly to a module installed in an SCXI chassis built by National Instruments in Austin, Tex. SCXI is a standard instrumentation backplane format, just as PCI is a standard format for computer backplanes.

A wide range of SCXI modules are available from a number of DAQ instrumentation vendors. Generally, SCXI modules are used to condition and multiplex analog signals. Conditioning consists of amplifying signals to improve their noise immunity, supplying any needed excitation power, and providing calibration corrections. Multiplexing consists of scanning the input channels to present them, one at a time, to the next component of the DAQ system.
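Conceptually, conditioning reduces to a gain-plus-correction step and multiplexing to a channel scan. A sketch in Python, with the gain and calibration values assumed purely for illustration:

    import random

    GAIN = 100.0         # amplification applied by the conditioning stage (assumed)
    OFFSET_CAL = -0.002  # per-channel calibration correction, in volts (assumed)

    def read_raw(channel):
        """Stand-in for the raw, millivolt-level signal on one input channel."""
        return random.gauss(0.010, 0.0005)

    def condition(v_raw):
        """Amplify and calibrate so the signal fills the converter's input range."""
        return GAIN * (v_raw + OFFSET_CAL)

    def scan(channels):
        """Multiplex: visit each input channel in turn, one value at a time."""
        for ch in channels:
            yield ch, condition(read_raw(ch))

    for ch, volts in scan(range(8)):
        print(f"channel {ch}: {volts:.4f} V")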

That next component is commonly referred to as "the DAQ card." It is an electronic circuit that receives the conditioned analog signal from the SCXI chassis, converts it to digital information and presents that information to the host computer.
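The conversion itself maps each conditioned voltage onto one of 2^N discrete levels, where N is the converter's bit depth. A sketch, assuming a 12-bit converter spanning +/-10 V:

    def to_counts(volts, v_min=-10.0, v_max=10.0, bits=12):
        """Map an analog voltage onto the 2**bits discrete levels of an ADC.
        A 12-bit converter spanning +/-10 V resolves about 4.9 mV per count."""
        fraction = (volts - v_min) / (v_max - v_min)
        counts = round(fraction * (2**bits - 1))
        return max(0, min(2**bits - 1, counts))   # clamp to the converter's range

    print(to_counts(1.234))   # -> 2300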

Many DAQ cards have signal conditioning and multiplexing circuitry built in. They are used for applications where the front-end signal processing is less demanding than in this hemodynamics-test application.

DAQ cards are available that can communicate through any port on the computer. By far the most common type of DAQ card fits into a slot on the computer's PCI-bus backplane. DAQ cards communicating through USB and PCMCIA ports are also available and gaining popularity, especially in installations where laptop computers are used to save space on laboratory benchtops. Less popular, but still well represented, are DAQ cards communicating through RS-232 (serial) ports and other computer-peripheral ports.

DAQ cards generally write the data directly into the computer's random access memory (RAM) so that it is immediately available for processing by the host computer. In most systems, it is then the job of the computer's microprocessor (running a specially written DAQ application program) to do any initial data processing, display immediately needed real-time results, and make a permanent record as needed.

In the hemodynamics test system shown in Fig. 1, the immediately required real-time results are Bugs' heart rate, his left ventricular diastolic pressure and the time derivative of his left ventricular pressure. The DAQ application program calculates those immediately from the incoming data, and displays them on the computer's monitor screen. At the same time, it records them along with the raw data on the computer's hard drive. It can also upload data and calculation results to a central database over Lilly's network.
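V I Engineering has not published its algorithms, but the flavor of such real-time calculations is easy to sketch: estimate heart rate from upward crossings of the mean pressure, and dP/dt from a numerical derivative. The sample rate and test waveform below are assumptions:

    import numpy as np

    FS = 1000.0   # sample rate in Hz (assumed)

    def lv_metrics(pressure):
        """Derive real-time numbers from a sampled left-ventricular pressure trace."""
        dpdt = np.gradient(pressure) * FS               # time derivative, mmHg/s
        # Crude beat detection: count upward crossings of the mean pressure.
        above = pressure > pressure.mean()
        beats = np.count_nonzero(above[1:] & ~above[:-1])
        heart_rate = beats / (len(pressure) / FS) * 60  # beats per minute
        return heart_rate, dpdt.max()

    # Two seconds of a synthetic 120-beat/min pressure waveform as a smoke test.
    t = np.arange(0, 2, 1 / FS)
    p = 60 + 40 * np.sin(2 * np.pi * 2 * t)
    print(lv_metrics(p))   # ~120 bpm, peak dP/dt ~503 mmHg/s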

V I Engineering also wrote some post-processing software, which could run on the DAQ host computer or on an individual researcher's office workstation (using data downloaded from the network). This post-processing software is set up to calculate any of approximately 20 variables of interest, such as the area under the rising part of the blood-pressure curve.
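As an illustration, the area under the rising part of a pressure curve reduces to a trapezoidal sum over the samples where the curve climbs--a sketch, with the sample rate again assumed:

    import numpy as np

    def rising_area(pressure, fs=1000.0):
        """Area under the pressure curve over the samples where it is rising.
        Simple trapezoidal sum; fs is the sample rate in Hz (assumed)."""
        p = np.asarray(pressure, dtype=float)
        rising = np.diff(p) > 0                    # True where the curve climbs
        trapezoids = 0.5 * (p[1:] + p[:-1]) / fs   # per-interval trapezoid areas
        return trapezoids[rising].sum()            # units: mmHg*s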

The input for a DAQ system can be just about anything that can be quantified, and there is no effective limit on the number of channels, either. Jeff Steele, National Instruments Area Sales Manager in their Bridgewater, N.J. office, tells of an application where the customer was trying to screen a large number of compounds for binding to a receptor.

"They want to do that quickly," he says. "The only way to do that is by doing it in parallel."

They achieve high parallelism in their test stream by tagging the bound receptor molecules with a fluorescent dye, then presenting them to a machine-vision camera in a 96-well microtiter plate. Wells containing compounds bound to the receptors show fluorescence, and those containing compounds that do not bind stay dark.

They use a standard microtiter-plate handler to present the plates to the camera under a UV light. From there on, the hardware is pretty standard for a machine-vision application. The camera picks up the image, which it transfers to a frame-grabber board plugged into the host computer's PCI-bus backplane. The frame grabber's job is to select one image out of the video stream coming from the camera (which it does when the robot signals that the plate is in position) and convert it to an array of numbers.

Looking at the system from a DAQ perspective, the camera stands in the position of the sensor. The frame grabber stands in for the DAQ card. The acquired image goes directly to the computer's RAM just like any other array of numbers representing acquired data.

In the software, this converges into a clear DAQ application. The software knows a priori which parts of the image correspond to each well in the plate. It simply isolates the numbers representing the light levels seen in those pixels, and sums them to get a fluorescence level for each well. The 96 numbers representing the fluorescence from the 96 wells are the output the system reports.
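A sketch of that per-well bookkeeping, assuming for simplicity that the plate fills the frame in a regular 8 x 12 grid (the real system would map wells to pixels from a calibrated plate position):

    import numpy as np

    ROWS, COLS = 8, 12   # a 96-well plate is an 8 x 12 grid

    def well_fluorescence(image):
        """Sum the pixel intensities inside each well's patch of the image."""
        h, w = image.shape
        dy, dx = h // ROWS, w // COLS
        levels = np.empty((ROWS, COLS))
        for r in range(ROWS):
            for c in range(COLS):
                levels[r, c] = image[r*dy:(r+1)*dy, c*dx:(c+1)*dx].sum()
        return levels    # 96 numbers, one fluorescence level per well

    frame = np.random.randint(0, 256, (480, 640))   # stand-in for a grabbed frame
    print(well_fluorescence(frame).shape)           # (8, 12)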

To write the program for this application, Bergdoll had to blend pure DAQ functions with image-analysis functions. That process was made easier by electing to write the program, which is shown in Fig. 2, in a graphical programming environment optimized for DAQ applications (NI's LabVIEW) that also offers a library of image-analysis functions (IMAQ Vision). The application could also have been written in a more conventional programming language, such as Visual Basic or C.

Fig. 2: Programming environments developed specifically for DAQ applications make software development easier. Courtesy National Instruments, Austin, Tex.

Linking the Islands

These custom applications are still DAQ islands of information. The real challenge comes when drug researchers want to combine disparate analytical techniques into a test-process flow. Such process-automation challenges have long been the norm on the production end of the pharmaceuticals business. With the expansion of massive drug-candidate screening efforts, they have reached all the way back to the discovery stage as well.

"Combinatorial chemistry, for instance, produces libraries of 100,000+ compounds," says Michael Swartz, Pharmaceutical Marketing Manager at Waters Corporation in Milford, Mass. "Scientists now have to look at each individual compound and test it against a particular activity for therapeutic use. They do that using a whole arsenal of analytical instruments, such as NMR, mass spectroscopy and chromatography. No one instrument vendor has control or data acquisition and manipulation for all of those instruments in one package."

Advanced drug-candidate-analysis techniques have taken on a hyphenated alphabet-soup surrealism that puts Depression-era Federal agencies to shame. Typical method designations include LC-MS-MS (liquid chromatography followed by mass spectrometry followed by more mass spectrometry), MUX-LC-MS-MS (multiplexed LC-MS-MS), LC-ICP-MS & LC-API-MS (LC followed by inductively coupled plasma dissociation of target molecules, followed by MS integrated with atmospheric pressure ionization--which adds charge while leaving the target molecules otherwise intact--followed by MS) and a host of other, similarly mind-wrenching, combinations.

The elements (LC, MS, ICP, etc.) are all sophisticated analytical steps run automatically from predefined methods. They have to be coordinated physically, temporally and electronically. On top of that, the software analysis systems have to account for the effects of all the twists and turns in the convoluted analysis procedures.

Finally, it all has to happen rapidly, reliably and repeatably--and without driving everyone, from the system integrators to the researchers wrestling with the results, crazy.

This tall order has driven instrument makers to become adept system integrators for their customers. The integration has to operate on three levels: physical flow of materials under test through the system, flow of electronic signals coordinating the different elements of the system as well as carrying test results back to the host, and flow of information through the software.

Micromass, a UK-based division of Waters, offers a system called ProteomeWorks, which combines sample preparation, instrument cleanup, chromatography, several types of mass spectrometry, protein sequencing and cloning. The system's purpose is to start with a set of separated proteins and automatically process them through to final protein identification--and provide an intelligible report.

Since, as Swartz pointed out, no one instrument company has all of the pieces of the puzzle, they have to make alliances with each other to fill in the gaps.

A laboratory automation company like Camile Products of Indianapolis, Ind., is in a slightly different situation. They have a history of providing the glue to bind complementary components from unrelated companies into custom laboratory systems. For them, it's only a case of selecting the appropriate bricks to set in their mortar.

Camile's CLARK (Camile Laboratory Automated Reactor Kit) is an example of the software needed to bind the various functions together. It provides a graphical representation of the physical system with displays of instrument-control parameters and real-time-updated measurement values.

Fig. 3: Vendors of laboratory automation systems are used to integrating islands of automation into larger systems. Courtesy Camile Products, Indianapolis, Ind.

Camile's system consists of both hardware and software. The hardware is a set of boards that fit into a CLARK chassis, just as the signal conditioning modules fit into Bergdoll's SCXI chassis. The CLARK modules, however, perform both the analog-to-digital conversion of the DAQ card and the digital-to-analog conversion needed to provide control signals to the analysis equipment. These boards communicate with the analysis equipment via analog voltages and with the computer via an RS-422 link. RS-422 is the multidrop version of the venerable RS-232 serial port available on virtually every personal computer. Additional system components might bypass the CLARK chassis by communicating directly with the host computer via the RS-422 link or their own digital links.
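From the host computer's point of view, an RS-422 link looks like an ordinary serial port. A Python sketch of polling one data point using the pyserial library--the port name, baud rate and query string are illustrative placeholders, not Camile's actual protocol:

    import serial   # pyserial: RS-422 appears to the host as a serial port

    # Port name, baud rate and the query string are assumptions for illustration.
    link = serial.Serial("COM3", baudrate=9600, timeout=1.0)

    def read_point(query=b"R1?\r\n"):
        """Poll one measurement value over the serial link."""
        link.write(query)
        reply = link.readline()      # e.g. b"+1.2345\r\n"
        return float(reply.strip())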

"We recently did a project where we were bringing in data points from a particle analyzer and intefacing that with our Camile TG software," Connor reports. "In a typical batch reactor application there are not a lot of data points--maybe for 18-20 I/O points for the total system. When, you hook up a particle analyzer, it starts collecting thousands of data points per second."

That makes a big difference, but it is a quantitative difference, not a qualitative one. You can still use the same software and hardware technology, just with more of the same kind of channels.

Intranetworking

Pharmaceuticals firms now want to automate their whole laboratories. That ratchets the system-integration problem to a qualitatively different level.

Physically, it increases the size of the problem by at least an order of magnitude. It also expands the volume of information that has to be moved and stored. Most significantly, the information output from the laboratory flows directly into the company's enterprise-wide information system. That moves it from the "instrument automation" world into the "information technology" (IT) world.

"The things that are mainstream in the DAQ world," Agilent's Winter points out, "are the exception in the IT world. Something as common as IEEE-488 in the DAQ world is the exception for the IT people. It gets difficult because the lab people have to justify using it. It is not 'mainstream.' It is different from the office environment.

"That was okay as long as the DAQ systems were islands of information in the environment. Now, the pharmaceuticals companies have to relate all of these islands of information into their ocean of information. We are talking about networking all of these DAQ systems so that they report all of their data into a central data repository."

Agilent decided to standardize on networked data systems quite some time ago. That means offering analysis systems based on IT-standard interfaces, such as an Ethernet-based network connection using TCP/IP, something just beginning to be done in the DAQ world. Supporting industry-standard protocols in the instrumentation itself keeps the DAQ system from becoming an island of information in the first place.
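The mechanics of reporting into a central repository over TCP/IP can be sketched with ordinary sockets. The host name, port and message format below are illustrative assumptions, not Agilent's actual protocol:

    import json
    import socket

    def report_result(instrument_id, result,
                      host="repository.example.com", port=5025):
        """Push one result record to a central repository over plain TCP/IP."""
        record = json.dumps({"instrument": instrument_id, "result": result})
        with socket.create_connection((host, port), timeout=5.0) as conn:
            conn.sendall(record.encode() + b"\n")

    # report_result("lc-ms-02", {"analyte": "caffeine", "peak_area": 1.87e6})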

These networked data systems can control multitechnique instrumentation, like LC-MS, GC-MS, UV/VIS spectroscopy, capillary electrophoresis, etc. There are even general-purpose interfaces that can capture digital output from just about any other device. You can hook all of these up to the networked data system, which will control the instrument, pick up the data, interpret the signals and then spit out numeric results upon which some kind of decision can be made. The software needed for one instrument is called an instrument driver. The instrument driver includes software modules for instrument setup, control and data collection.
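In code terms, an instrument driver bundles those three modules behind a single interface. A skeleton, with method names chosen for illustration rather than taken from any vendor's API:

    class InstrumentDriver:
        """The three modules an instrument driver bundles: setup, control
        and data collection. Concrete drivers fill in the details."""

        def setup(self, **parameters):
            """Push method parameters (wavelength, stop time, ...) to the instrument."""
            raise NotImplementedError

        def control(self):
            """Start, monitor and stop the predefined analysis method."""
            raise NotImplementedError

        def collect(self):
            """Return acquired data, ready for upload to the central repository."""
            raise NotImplementedError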

Fig. 4: Networked data systems ease the system integration problem by leveraging standard client/server network technology. Courtesy Agilent Technologies, Waldbronn, Germany.

The instrument setup modules include a graphical user interface (GUI) that gives you access to the parameters that are relevant on that instrument. The GUI for a diode array detector, for example, would set parameters like wavelength and bandwidth. For units that have more than one lamp, it would determine which lamp is to be used. The GUI might also set a stop time, and whether it should run an automatic balance routine at the beginning of the run.
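Those settings amount to a small record of parameters pushed to the instrument before a run. A sketch, with field names and defaults chosen purely for illustration:

    from dataclasses import dataclass

    @dataclass
    class DiodeArraySetup:
        """Parameters a diode-array-detector setup GUI typically exposes.
        Names and defaults are illustrative, not any vendor's actual API."""
        wavelength_nm: float = 254.0
        bandwidth_nm: float = 4.0
        lamp: str = "UV"              # which lamp to use, on dual-lamp units
        stop_time_min: float = 30.0
        auto_balance: bool = True     # run a balance routine at the start of the run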

Since the instrument drivers are modules within the same family, you can plug them together to fit exactly the system that you have. The important bit is that they can all be connected to a central data repository, such as an Oracle database, that can pull together all of the data being measured in the analytical lab. That is useful not just for archival purposes and for backup, but also for correlation--putting together pieces of data that have been measured over time on different instruments by different people.

"When I started in this business," Swartz recalls, "we had our strip-chart recorder hooked up. It drew a red or a blue or a black line on paper, and that was it.

"Now, we use workstations. We refer to them as clients on a large network. A lot of our major customers have client-server networks with point-of-use PCs that log onto servers and the applications that are running are centrally managed."

 

