Happy molecular biologists are all alike; every protein chemist is unhappy in his own way. Purifying nucleic acids, with their comparatively uniform chemistry, simple building blocks and predictable structures, is often tedious, but usually straightforward. In biomolecule purification, if you have seen one nucleic acid, you have seen them all. Proteins, with their astonishing chemical and structural diversity, are a different story. If you have seen one protein, you have seen one protein.

In the past few years, the rise of proteomics from buzzword to bona fide discipline has simplified protein chemists' lives considerably. Nonetheless, a few challenging classes of molecules still resist easy, large-scale purification. In a cruel twist, these hard cases include many of the proteins biochemists and drug developers are most interested in studying: membrane- and organelle-bound signaling molecules, post-translationally modified proteins (Box 1) and scarce hormones diluted in complex biological fluids like serum.

Protein chemists have not given up, though, and gradual improvements in chromatography and separations are chipping away at some of the field's longstanding problems. Meanwhile, the rapid evolution of new analytical technologies, especially mass spectrometry, is simplifying protein purification by allowing researchers to analyze dirtier samples.

The revolution will be ionized

“If you talk about how the chromatography and tools associated with chromatography [have] improved, I think there have been incremental improvements,” says David Speicher, director of the proteomics facility at the Wistar Institute. Although soluble-protein purification has become relatively straightforward, Speicher says membrane-protein purification has lagged behind.

Indeed, biochemists working on soluble proteins have been deluged with handy new separation technologies recently1 (Box 2). New liquid chromatography columns from PhyNexus are built to fit standard pipettors, reducing sample volumes and shortening run times. High-performance liquid chromatography (HPLC), long a workhorse of fast protein purification, has been mechanized by GE Healthcare, with its automated ÄKTAxpress system. And Waters has upped the ante—or at least the pressure—even further, with ultra-performance liquid chromatography (UPLC) that promises to shorten processing times and boost resolution.

Chromatography's merry-go-round still in fashion. Credit: ©iStockphoto.com/Gannet77

“UPLC is something that's certainly on the horizon,” says Gary Siuzdak, director of the Center for Mass Spectrometry at the Scripps Research Institute. None of the new technologies, however, are especially good at separating transmembrane proteins, a class that includes many of the most important signaling molecules in the cell.

The problem frustrates both biochemists and product vendors. “Although we offer a membrane-protein fractionation kit for rapidly isolating these proteins from cells, we recognize that membrane proteins represent a significant challenge for purification and analysis,” says Greg Hermanson, director of technology at the Pierce division of Thermo Fisher Scientific. Hermanson adds that “many transmembrane proteins are highly unstable after extraction and undergo conformational changes and denaturation, even in the presence of detergents and stabilizing agents.”

Companies like Thermo Fisher are working on new strategies, but for many scientists, the quickest solution may be to drive around the problem. “The chromatography tools haven't changed very radically. The radical improvements are on the protein identification side,” says Speicher. In just a few years, he explains, mass spectrometers have evolved from distinguishing a few dozen proteins in a sample to distinguishing hundreds. The machines cannot quite handle a crude cell lysate yet, but biochemists can now obtain clean data from surprisingly dirty samples.

Speicher recalls, for example, a project for which his group wanted to identify ligands that interacted with a specific membrane protein. Unfortunately, the researchers were unable to get enough pure protein to analyze, just faint bands of potential ligands on a gel. The team abandoned the project and took a different tack, “but today if we had that gel, we'd just cut the band out and pop it in the mass spec, and we'd have a protein identification, and if it turned out that it was a contaminant, we'd know it immediately, or if it was a specific ligand we'd get that identification very early,” Speicher explains.

Nonetheless, researchers need to be careful not to overinterpret mass spectrometry data, especially with impure samples. “You cannot compare it to what we have in nucleic acids. If you have a negative in a PCR, you can say 'this mRNA is not expressed'. But if you can't see something in the proteome, it may just be below the limit of detection,” says Kai te Kaat, global business director for proteins at Qiagen. He adds that “from a proteomics technology standpoint, we still have a way to go to really make a negative result not something I can't see but something that isn't there.”

Pieces and parts

Although sophisticated mass spectrometry systems may allow proteomics researchers to avoid some of the challenges of classical purification, even mass spectrometrists agree that this approach will not work for everyone. “If you're ultimately looking for structural or really detailed information on a particular protein, ultimately you're going to have to go the purification route,” says Siuzdak.

The traditional way to purify a difficult protein to homogeneity is to develop a system that overexpresses it, then enrich it through a series of columns and other separations. Although this outline has remained essentially unchanged for decades, a few recent improvements can accelerate the process. For example, many suppliers offer expression vectors that can be shuttled easily between Escherichia coli and more complex organisms, drastically simplifying—or even eliminating—the tedious process of vector construction.

“Companies are selling libraries of cDNAs in expression-ready systems, so if you're lucky enough to be working with a protein or a set of proteins where they're commercially available, you don't even have to make the expression vector, you just buy it,” says Speicher.

But even with an expression system in hand, purifying a membrane- or organelle-bound protein can be a nightmare. Biochemists typically start by fractionating cells into their cytosolic and membrane compartments, using classical techniques like density-gradient centrifugation. Unfortunately, the centrifuge cannot distinguish one membrane from another.

As te Kaat explains, “If you look at membrane-bound proteins, what you usually get in homebrew methods...is all membrane-bound proteins, so this means you have endoplasmic reticulum, [and] you have the organelles,” as well as plasma membranes. Worse, centrifugation techniques are notoriously touchy, so results often vary from one experiment to the next.

To address this problem, Qiagen now offers subcellular fractionation kits, including one specifically for plasma membranes. “What this kit actually delivers is a very, very high enrichment of the plasma membrane with, for example, almost complete absence of endoplasmic reticulum molecules,” says te Kaat. The kits use proprietary buffers and column resins, relying mostly on affinity purification to isolate the desired components.

Thermo Fisher also offers an extensive series of kits and reagents for subcellular fractionation. “We now allow researchers to fractionate cells into cytoplasmic, nuclear, membrane, mitochondrial, lysosomal or peroxisomal fractions,” says Hermanson, adding that in many cases, the company's kits yield intact organelles that can then be studied in vitro or subjected to further purification. The kits have been big sellers, and Thermo Fisher is working on introducing more.

Even researchers who do not need highly pure proteins may find organelle fractionation useful. “Another important aspect to subcellular fractionation is that it provides important physical context to proteomics analysis. It not only facilitates the identification and characterization of proteins, but it generates information regarding where they naturally exist within a cell,” says Hermanson.

Blood work, with less sweat

In addition to separating cells into their individual components, biochemists have also struggled with the problem of fractionating biological fluids, especially serum. For clinical research and new diagnostic strategies, measuring changes in blood-borne hormone levels is often the order of the day. Unfortunately, these molecules are usually present in vanishingly small quantities, immersed in a stew of vastly more abundant but much less interesting proteins. Load serum onto most liquid chromatography columns, for example, and the result will be a column clogged with albumin.

To meet this challenge, tool makers have been rolling out specialized resins that bind common proteins like albumin. Overall, the products have gotten good reviews. “One of the things that we're doing is looking for plasma biomarkers, and one of the incremental improvements there...was the first commercial products which utilized immunoaffinity to deplete abundant proteins,” says Speicher, adding that “we've found that to be greatly enabling.” The new immunodepletion columns have been available since about 2003, and companies like Sigma-Aldrich and Agilent now offer extensive immunodepletion resin product lines.

“Because of the success of removing the 6 top proteins, there's been interest now in removing the top 15 or so proteins, [because] when you reduce these top-level proteins, it allows you to delve further into the proteome,” says Siuzdak.
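A rough back-of-the-envelope calculation shows why depletion "allows you to delve further." For a fixed amount of total protein loaded onto a column or instrument, removing the most abundant species multiplies the share available to everything else. The mass fractions in the sketch below are illustrative, often-quoted ballpark figures for plasma, not specifications of any particular product:

```python
# Back-of-the-envelope sketch: how depleting abundant plasma proteins
# stretches a fixed column or instrument load. The mass fractions below
# are illustrative, commonly cited ballpark figures, not measured values.

def enrichment_factor(depleted_mass_fraction: float) -> float:
    """Fold-enrichment of the remaining proteome when a fixed total
    protein load is drawn from depleted rather than raw plasma."""
    return 1.0 / (1.0 - depleted_mass_fraction)

top6 = 0.85   # albumin, IgG, transferrin, etc.: very roughly 85% of plasma protein mass
top20 = 0.99  # extending depletion to ~20 proteins: roughly 99%

print(f"Top-6 depletion:  ~{enrichment_factor(top6):.0f}x more low-abundance protein per load")
print(f"Top-20 depletion: ~{enrichment_factor(top20):.0f}x more low-abundance protein per load")
# Prints ~7x and ~100x: each additional tier of depleted proteins buys
# roughly another order of magnitude of depth into the proteome.
```

The gain is purely multiplicative: the scarce hormones of interest remain scarce, but a fixed load now contains far more of them, which is what pushes them above an instrument's detection limit.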

The new resins are not a panacea for plasma proteomics, though. In particular, manufacturers have generally favored products that deplete human serum. “Mouse immunodepletion has really lagged behind,” says Speicher, adding that “there are only a few products, and they don't work as well as the human products do, and there seems to be little commercial interest in developing that.” Researchers using non-mouse animal models may be entirely out of luck. “That's going to be very much a niche, and given the expense of developing these immunodepletion columns, I would be skeptical that companies are going to,” says Speicher.

One of the available subcellular fractionation kits focusing on mitochondria. (Courtesy of Pierce-Thermo Fisher.)

For those working on human serum, Speicher and Siuzdak say the columns are quite useful, provided researchers keep their limitations in mind. Like most liquid chromatography media, immunoaffinity resins adsorb some molecules nonspecifically, so the method is not strictly quantitative, and proteins present in very low concentrations may simply vanish. When looking for relative changes in the levels of signaling proteins, however, immunodepletion can work very well.

Learning to love 2D

No matter what kinds of proteins a laboratory studies, if the goal is a proteomic survey, sooner or later two-dimensional (2D) gel electrophoresis will come up. Indeed, in the 12 years since the term “proteomics” first appeared, separating all of a sample's proteins on a 2D gel has been a hallmark of the field. For just as long, vendors and engineers have been promising that this troublesome technique is on the verge of being replaced by something better.

The motivation to replace 2D gels is strong. The standard technique requires pouring, running and slicing large polyacrylamide gels manually, with ample room for errors and variation at each step. Results commonly differ widely from one experiment to the next. Nonetheless, 2D gels have established a durable niche. “The 2D gel approach isn't going to be going away anytime soon,” says Siuzdak, echoing a widespread sentiment among researchers.

Mitochondrial protein enrichment results using the Qiagen separation kit. (Courtesy of Qiagen.)

Several companies have tried to automate 2D gel handling, but that approach has been slow to catch on. “Most people who are doing 2D gels are still doing them the old manual way,” says Speicher. Chip-based and column-based alternatives are also in development, but so far, none have managed to supplant the gels.

The technology has evolved somewhat in recent years, though. In particular, the difference gel electrophoresis (DIGE) technique is helping many groups overcome the legendary reproducibility problems of 2D gels. In DIGE, the investigator pools aliquots of all the experimental and control samples and labels the pool with one fluorescent dye, creating a reference sample. Another aliquot of each experimental and control sample is then labeled individually with a second dye. Each individually labeled sample is run together with the pooled reference on a 2D gel, which researchers then scan at each dye's wavelength. The result is an experimental pattern and a reference pattern, the latter providing alignment points for comparisons between gels.

“So if I have a gel-to-gel variation...that's going to happen with the reference as well as experimental samples,” says Speicher, who adds that “my experience in talking to colleagues is that...an increasing percentage of people doing 2D gels are migrating to that approach.”
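The arithmetic behind the internal standard is simple enough to sketch. The spot volumes below are hypothetical, and real DIGE image-analysis software does considerably more (spot matching, background correction), but the core normalization is just a per-gel ratio:

```python
# Minimal sketch of the DIGE internal-standard idea. Spot volumes are
# hypothetical; the point is that dividing each sample by the pooled
# standard run on the *same* gel cancels gel-to-gel variation.

# Each gel carries one labeled sample plus an aliquot of the pooled standard.
# Gel-specific distortions (loading, scanning, run conditions) act as a
# multiplicative factor that hits both channels of a gel equally.
gels = [
    {"sample": "control", "sample_vol": 1200.0, "standard_vol": 1000.0},
    {"sample": "treated", "sample_vol": 2100.0, "standard_vol":  700.0},
]

for gel in gels:
    # (f * sample_true) / (f * standard_true) = sample_true / standard_true,
    # so the gel factor f divides out of the standardized abundance.
    gel["standardized"] = gel["sample_vol"] / gel["standard_vol"]
    print(f'{gel["sample"]}: standardized abundance = {gel["standardized"]:.2f}')

fold_change = gels[1]["standardized"] / gels[0]["standardized"]
print(f"treated/control fold change = {fold_change:.2f}")  # comparable across gels
```

Because the experimental sample and the pooled standard migrate through the same gel, any distortion affects both fluorescent channels and drops out of the ratio, which is what makes standardized abundances comparable from gel to gel.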

Tool makers are still working on ways to simplify 2D gels or eliminate them entirely, but they concede that quick solutions are not just around the corner. “2D gel separations still offer important information on protein modification, such as phosphorylation, glycosylation and peptide cleavage that is difficult or impossible to evaluate by [mass spectrometry],” says Hermanson.

That theme is typical of the remaining 'difficult' problems in protein chemistry. Although researchers still hope for a big breakthrough, in the meantime they are optimistic about the incremental evolution of existing techniques. “The combination of these new technologies is...going to far surpass what we've done in the past, and I think we're going to see a lot of very interesting things coming out,” says Siuzdak (see Table 1).

Table 1 Suppliers guide: companies offering products for separation and chromatography