Wednesday, January 22, 2014

My electronic lab book

You may or may not have any experience with electronic lab books.  Many of the "better" ones are meant to be integrated with some sort of LIMS (Laboratory Information Management System/Software), which may or may not cost a lot of money and may or may not be useful to your specific needs (for a real joke, check out LabBook Beta on the Play Store).  Personally, I have tried to digitize various parts of my lab life for years, but I always come back to paper and pen, securely taping important items (product inserts, gel photos, etc.) into my notebook.  As a result, I now have numerous notebooks spanning all the way back to 2001.  Since my notes are organized by date, I can usually recall approximately when something I need to reference was done, but it can take some time to go through everything to find it.  I have also watched several other people do one task or another on the computer, leaving their lab notes scattered among Excel files, Google Docs, and the traditional lab book.  So I have been looking for an electronic notebook that is as similar to paper and pen as possible and may allow for better organization.  Most importantly, it has to feel natural.  If I am forcing myself into the e-notebook exercise, it isn't going to work well and I will be back to paper pretty soon.

I've had a smartphone for about a year now, so I am familiar with the Android OS.  I also have an iPod Video that ran faithfully from 2006 until recently, and I occasionally help out people who prefer Mac OS.  Given the various issues getting the iPod to play nice with Windows and Linux, and my recent positive experience with Android, I was pretty sure I should go for Android.  Also, it hurts the pocketbook less.

The tablet.  Elegant-looking Samsung hardware.
I settled on a Samsung Galaxy Tab 3 10.1.  I got a refurbished device off Newegg for about $300 with shipping, and simultaneously purchased a cover, stylus, and screen protector.  The cover was another $15, the stylus $25, and the screen protector (pack of 3) was $6.  I had played around with some software using my phone, and planned to use the popular free app Papyrus (looks like a paper notebook) to test drive the new tablet.

Then everything arrived, and I learned a few things...  First, the tablet I purchased has only a capacitive screen.  These are far better than their resistive predecessors, but they do not have the stylus functionality of a Galaxy Note series tablet (a few other manufacturers offer something similar).  The Note has an integrated stylus called an S-pen, which is a digitizing device.  When you enable S-pen functionality in your handwriting software, the screen no longer responds to your finger touch, a means of "palm rejection."  Unfortunately, I had purchased an S-pen stylus that was totally incompatible with my capacitive screen.  And how was I going to make this thing work anyway?  I went to Staples and picked up a Wacom Bamboo Alpha stylus for $15, which seemed to have a finer point than most other capacitive styluses, was thin like a pen, and had decent user feedback online.

The cover.  Wakes and sleeps your device when opened or closed.
Unfortunately, I could use my chosen app (Papyrus) for writing only if I also kept a small piece of bubble wrap on hand to insulate my hand from the screen.  As I wrote across the screen I would have to stop and adjust the position of the bubble wrap.  This is not practical, and I was doubting myself already.  So I went through the Google Play store and downloaded free versions of other possibly useful handwriting apps with decent reviews.  If they didn't have a free version to test, I just ignored them, since I can't spend university money on apps that might be completely useless (hint hint, devs).  I tested Papyrus, FreeNote, INKredible, and LectureNotes.  As I mentioned previously, Papyrus lacked palm rejection for a capacitive screen.  Same with FreeNote and INKredible, although INKredible definitely felt really nice when writing.  Hard to explain, but you need an app that lets your brain respond like it would to the physical act and immediate feedback (seeing your written strokes) of writing on paper.  The ONLY app I tested that has a useful palm-rejection function is LectureNotes.  Luckily, it writes well too.  There are a lot of people online disparaging the use of a capacitive screen, or even the functionality of palm rejection in LectureNotes, but I tell you it works very well.  Many people online suggested downloading a free app called TouchscreenTune to adjust the sensitivity of the screen and improve the palm rejection, but all this app does for me is open briefly before crashing, so it was no help whatsoever.

I did need to go out and purchase another stylus.  For $30, I picked up a Jot Pro by Adonit.  This is the only capacitive stylus you will find with a usefully small tip.  The tip is embedded in a clear plastic disc that lets you see your writing and keeps the point from damaging your screen.  A little strange at first, but you forget it's there pretty fast.
Adonit has a newer stylus called the Touch, which has Bluetooth functionality and an onboard accelerometer to provide pressure sensitivity and better palm rejection, but the advanced functions don't work on Android (yet), only on iOS (iPad).  It is unclear whether the company (or other app devs) intends to port these functions to Android.

The stylus.  It's magnetic and sticks to the back of the cover.
Almost all the pieces were in place, but I still didn't have a completely functional electronic lab book.  I do DNA work, so I run a lot of gels that I am accustomed to taping into my notebooks.  Also, I wanted the ability to export my notes to the cloud so that I could share specific notebooks with collaborators.  This turned out to be pretty easy.  LectureNotes ($4.50 or so for the full version) has a splendid amount of available customization.  I can export each notebook as a PDF and specify the destination folder in my directory structure.  Then I use a second app called FolderSync ($2.50 or so to get rid of ads) to sync the contents of that directory with a cloud service.  I chose Dropbox since I got 50GB free for purchasing the tablet, but I would probably use my Ubuntu One or Google Drive account instead if I didn't have that resource.  FolderSync can use each of these services and many more.  After adding the computer I use to take gel photos to Dropbox, I can now import gel photos by telling LectureNotes to import a photo from...  Then I choose Dropbox, browse to the new photo, resize it, move it to the position on the page I want, and done!!  In order to upload my notebook to the cloud, I still have to manually choose "export" in LectureNotes, but this goes pretty fast.

And now I have something that is working.  Certainly a Note series tablet (or other device with active stylus capability) would be better suited to my needs, but they are still pretty expensive.  I find myself already coveting the 12" Note that Samsung recently announced for release in the next few months, both for the increased real estate and the active stylus functionality (S-pen), but I expect this device to cost at least $700.  So to recap: you absolutely can use a capacitive screen and stylus for your lab book (detractors, please sit down!).  The tablet hardware may be important to my success, so I wouldn't count on a much cheaper device functioning as well.  I am using a Samsung Galaxy Tab 3 10.1 with LectureNotes (using heuristic palm rejection at 6000ms delay) and an Adonit Jot Pro stylus.  With FolderSync, my notes are synced as PDF files to my Dropbox account for sharing with collaborators.  Happy e-notebooking, scientists!!

A shot of some notes I took today in LectureNotes, complete with gel photo.  My handwriting isn't much worse on the tablet than on paper.  I was also able to import the pages I had managed to produce previously in Papyrus by exporting them as .jpg files and importing them as images onto new pages placed before my existing pages.

A shot of my old lab book for comparison.

Saturday, November 2, 2013

Remote Desktop Connection from Ubuntu

I have been enjoying Ubuntu Linux since 2008.  Like many, I didn't see it as a viable replacement for Windows, as I still require the suite of MS Office programs in order to collaborate functionally with colleagues.  In 2011, I acquired a netbook which came with Windows 7 but was far too puny to run this OS properly, let alone drive normal programs once it came up.  Before long, I ditched Windows from this machine in favor of Ubuntu (11.04 at the time, now 13.10).  This isn't a super laptop, far from it, but it is nice to know there is an OS I can handle with my ultra-portable, bombproof netbook.  The thing has a 32GB SSD, 2GB SDRAM, wifi, and an 8.9" screen.  I used the xrandr command to build a short script to modify the lame 1280x600 native resolution of the screen to a more comfortable 1368x768, and with LibreOffice 4.1 and access to the offerings of Google Drive, I am more compatible than ever with my Windows/Mac-loving colleagues.  And still, I can't shake Windows, as I have several machines I use at work.
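For anyone curious, that xrandr script looks something like the sketch below.  The modeline came from running `cvt 1368 768`; the output name (LVDS1) and the script filename are my assumptions -- check `xrandr -q` for the actual name of your display, and regenerate the modeline on your own machine:

```shell
cat > fixres.sh <<'EOF'
#!/bin/bash
# Register a 1368x768 mode and switch to it (modeline from `cvt 1368 768`).
# LVDS1 is a guess for the internal panel; confirm with `xrandr -q`.
xrandr --newmode "1368x768_60.00" 85.25 1368 1440 1576 1784 768 771 781 798 -hsync +vsync
xrandr --addmode LVDS1 "1368x768_60.00"
xrandr --output LVDS1 --mode "1368x768_60.00"
EOF
chmod +x fixres.sh
```

Running the script once per session (or hooking it into your startup applications) gets the comfortable resolution back.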

Until an Ubuntu edition of MS Office is available (ever?), I probably will never get away from Windows, but today I found one more reason to need Windows even less.  I was bouncing from computer to computer today taking data from various places and consolidating everything in Google Drive spreadsheets.  Once collated, I will then need to send my data to my on-campus Windows image where certain statistical packages reside (JMP, SAS).  I was busy all day with lab work and lamenting that I was going to have to come back in tomorrow to do this work, or else stay very late.  If only I could run my stats from my couch...

This is when I discovered rdesktop, an easy to use client for connecting to a Windows Remote Desktop Connection from an Ubuntu computer.  From Ubuntu, it is easy to install (probably in the software center too):

sudo apt-get install rdesktop

It is a small application and installs quickly.  You are almost done...

From the terminal (doesn't come up in the dash), type

rdesktop servername

(for me, rdesktop vlab.nau.edu)

You should be at your familiar login screen.  For other NAU users, you will need to click the "Other user" button and change your domain (e.g., NAU\username) to log in.  However, the native rdesktop window is uncomfortably small, and for some reason window resizing is not an option.  Fortunately, you can use the -g option to set a specific resolution (say, -g 800x600) or a percentage of your screen size.  I like the percent option and found 90% to work best in most cases (with the scaling, a portion of your window can protrude into neighboring workspaces at >95%).  So now I log in as follows:

rdesktop vlab.nau.edu -g 90%

But that is too much to type, so I wrote a little one line script to fill in the details for me.  In my local scripts directory I did the following:

nano vlab

This puts me into the text editor nano and starts me editing the new file called vlab.

Add the following text to the text editor:

     #!/bin/bash

     rdesktop vlab.nau.edu -g 90%

Hit Ctrl+X to exit nano, saving as you go.

Change the permissions of the file:

sudo chmod a+x vlab

Test your script locally:

./vlab

If it works, copy the script to your bin directory so it can be called no matter your working directory:

sudo cp vlab /bin/

That's it.  Now I can go home and run my stats, and all I have to do to get the thing running is open a terminal and type:

vlab
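The per-server script pattern above generalizes easily.  Here is a minimal sketch (the script name "rdp" is my own invention) that takes the server as its first argument and an optional geometry as its second, defaulting to the 90% discussed above:

```shell
cat > rdp <<'EOF'
#!/bin/bash
# Usage: rdp <server> [geometry]
# Connects with rdesktop; geometry defaults to 90% of the screen.
rdesktop "${1:?usage: rdp <server> [geometry]}" -g "${2:-90%}"
EOF
chmod a+x rdp
```

Then `./rdp vlab.nau.edu` or `./rdp vlab.nau.edu 800x600` both work, and the script can be copied into your bin directory just like vlab.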

Best of all, I can now remotely access a Windows computer from my Ubuntu Linux netbook, giving me one less reason to need/desire Windows on my portable computer.  Of course someday I will graduate, but that also means I could purchase a competent desktop computer in the future, keep it at work (or at home if workplace firewalls are too cumbersome), access all my Windows needs from elsewhere, and negate any need to maintain synced cloud accounts (Dropbox, Ubuntu One, etc.) for my workplace documents.

Happy remote desktoping, Ubuntuers!

Wednesday, May 1, 2013

Mystery of pH change when flying

I have a little story to relate here, and I would be interested to hear back from anyone who has an idea what is happening.

It all started last summer when our lab took on a project for another lab at a different institution.  The researcher shipped me their DNA in plates, along with primers, and instructed me to perform multilocus genotyping on the roughly 600 samples.  Upon receipt, I ran a quick PCR check of a few samples for each locus, and everything looked beautiful, so I tossed the project into the freezer, intending to process everything in a week or two when I knew I would have time to devote directly to this job.  When I got back to the project, I ran the same quick PCR check just to be sure, and this time nothing really worked.  Perplexed, I repeated the exercise, thinking that perhaps I had forgotten to add something crucial, but again the same non-result.  I spent the next two weeks frantically troubleshooting this project, hesitant to contact the client since I had no idea what had happened to once perfectly good DNA that had been opened only once and placed in a freezer with no temperature fluctuations.

Did I contaminate the DNA with some degrading compound in the brief period I had it opened?  This seemed unlikely since I do this process all the time, using the same lab practices I used during these PCR checks.  Eventually I contacted the other lab and they sent me more DNA to work with.  When I received that shipment, I processed all of the samples immediately in fear that they also would degrade.  During this processing, I stumbled onto a bit of evidence about what may have happened.  In my PCR mix, I use phenol red as a colorant.  This is also a handy pH indicator which is a lovely dark red above about pH 8, but goes to an alarming yellow when the pH drops.  I was doing small PCR reactions (4uL in 384well plates), so I had 3uL mastermix in a plate to which I was adding 1uL DNA.  I added some DNA to a set of these reactions, and watched as they immediately changed from red to yellow.  Immediately I took a few microliters of a sample and streaked it across a pH strip -- pH 5!!  This prompted me to inquire to the other lab about the method used to extract the DNA and the buffer in which it was stored, etc.  Samples were all extracted by the popular Qiagen kit, but this was actually done at a third lab so they weren't sure of the storage buffer.  I was put in contact with the next lab, and they claimed the DNA was always eluted in Tris-Cl pH 9.0 (hey, my favorite buffer!!).  I insisted this couldn't be the case and wondered if they had accidentally used nanopure water from an RO source or something that might actually have such a low pH, but they stated otherwise, and there was no use arguing anymore.  I finished processing the samples and put the whole mess behind me, thinking it would always remain a nagging mystery.

In November I traveled to another lab to learn a technique for a new instrument we had received.  As part of this exercise I brought some DNA with me that I had prepared for the process, but paranoid as I can be, I decided to bring all the pieces of my chemistry along in case anything went wrong.  These pieces included several plates containing PCR reactions with phenol red.  Everything traveled with me in my luggage in an insulated container with a bit of dry ice.  I inspected the contents upon arrival, and all seemed in order, so I tossed them into the freezer.  The next day, I prepared to process these samples, retrieving them from the freezer and allowing them to thaw.  I was making some notes in my lab book and picked up a plate to check if it had thawed yet, and was horrified to find everything had gone to yellow!

"Not again," I thought.

I frantically started looking for some Tris buffer to add to bring the pH back to where it should be.  Surprisingly, the lab I was in had none on hand, so I headed down the hall, bothering anyone I found in a lab for a little Tris.  I located some within 10 minutes, took an aliquot into a Falcon tube, and headed back to my precious samples.  I grabbed the first plate, and just before I tore the foil seal off, I saw the wells had gone from yellow back to red.  What the hell??  Upon closer inspection, I saw this was only the case in a few of the wells that I happened to have opened briefly, and thus exposed to the atmosphere, before resealing.  Curious, I tore the foil seal off, put a new seal on, vortexed, and spun my plate down.  Now all the wells were back to red.

So exposure to the atmosphere seemed to have solved my pH problem.  So what can pH do to DNA?  DNA is actually a pretty stable molecule (read up on the RNA world hypothesis for why it is so stable).  It is an acid, and as such is most stable in a slightly basic buffer solution (that's why we love Tris so much).  However, raise the pH too much and the bases no longer pair (e.g., alkaline denaturation as in Illumina preps); decrease the pH too much and other bad things start to happen.  Low pH, I have read, leads to depurination (loss of A or G bases), effectively fragmenting your DNA into unusable bits (no longer than about 30 nucleotides).  How low does it need to be?  In theory, anything acidic will contribute to this effect, but the more acidic you get, the more rapidly it will occur.  If my chemistry background serves correctly, things will really start to change as you approach the pKa of DNA, which is somewhere around pH 5.0 -- right about where I had measured the pH of the DNA from the project last summer.  Storing DNA in water rather than a buffered solution is known to be less than ideal, and this may be related to the natural dissolution of atmospheric carbon dioxide into standing water as carbonic acid, but the pH of such water is generally measured around 6.7 or so, not terribly acidic at all.
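As a sanity check on the carbonic acid idea, here is my own back-of-the-envelope calculation (idealized; not from any of the labs involved) of the pH of pure water equilibrated with atmospheric CO2, using a Henry's-law estimate of dissolved CO2 and the first dissociation constant of carbonic acid:

```python
import math

# Idealized: assume dissolved atmospheric CO2 is the only acid present.
co2_aq = 1.2e-5      # mol/L dissolved CO2 at ~400 ppm (Henry's law estimate)
ka1 = 4.45e-7        # first dissociation constant of carbonic acid
h_plus = math.sqrt(ka1 * co2_aq)   # weak-acid approximation: [H+] ~ sqrt(Ka*C)
print(round(-math.log10(h_plus), 2))   # ~5.6
```

The idealized answer is about pH 5.6; real lab water often reads higher (like the ~6.7 above) because trace impurities and the low ionic strength of pure water make electrode readings unreliable.  Either way, unbuffered water sits on the acidic side.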

So what could be happening here?  When you place samples into a plate sealed with foil, there is slow evaporation/sublimation of your storage buffer over time, presumably due to slow air exchange through the adhesive layer holding your foil in place.  During an average airplane flight, despite the pressurization of the cabin, everything on the plane is at a markedly lower pressure than when the plane is on the ground.  The gaseous contents of the cabin aren't terribly different from what you find at sea level, otherwise there wouldn't be enough oxygen to remain conscious at 35,000 ft.  So: low partial pressure of gaseous components, plus subtle permeability of your sealed plate.  This should actually release dissolved gases back into the atmosphere, and the loss of carbonic acid should raise the pH.  But that's not what I saw, and everything can be fixed by thawing, removing the seal briefly, and applying a new seal upon arrival at your destination.  So the problem is solved, but what was the problem in the first place?

Anyone??

Monday, March 18, 2013

Bead cleanups

This post, which discussed results published by Rohland and Reich (2012), has been removed at the request of Beckman Coulter legal counsel.

Friday, February 22, 2013

DNA precipitations

Haven't posted in a while, so this will be quick.

People ask me about their protocols on a pretty regular basis.  Even if they don't ask specifically, I often ask to see their workflow so I can tell if I am giving reasonable advice in the context of their whole experiment.  One of the most common things I encounter is the insistence on putting samples in the freezer to "help" with precipitating their precious DNA.  However, this idea should have been laid to rest almost 30 years ago now with the publication of "Ethanol Precipitation of DNA" in Focus (Fall 1985, Vol 7, No 4) by Zeugin and Hartley.

The key findings of this paper are that precipitation is less efficient at low temperatures than at warm temperatures, that the length of incubation has minimal effect except for very dilute samples, and that centrifugation time is the most important factor in planning your precipitation.  The precipitations in this paper were NaOAc/ethanol precipitations, with a final NaOAc concentration of 0.3M (1/10 vol 3M NaOAc) and a final EtOH concentration of roughly 70-75% (~2.5 vols EtOH).

The authors concluded that cold incubation is not beneficial to the precipitation of DNA and can even be counterproductive if your sample contains only a small quantity of DNA.  They speculate that decreased temperature increases the viscosity of the solution and inhibits the motion of DNA through the ethanol solution during centrifugation.  I agree with this idea, but I would also speculate that by removing energy from the system you simply slow the kinetics of precipitation, in which free cations associate with the sugar-phosphate backbone while ethanol drives water molecules out of the double-helix structure, precipitating the DNA as a salt.  Certainly such interactions will increase during centrifugation, so lately my thinking has been that centrifugation is the only thing that matters much during your precipitation.  This is also supported by their data on the effect of centrifugation time.  Again, high-concentration samples precipitated readily while low-concentration samples needed more time, but most concentrations seemed to taper toward maximum recovery at 30 min, which unfortunately was the longest centrifugation time they tested.

For me, this has resulted in the following standard precipitation conditions.  I prefer NaCl over NaOAc because I have some loose data indicating better recovery (not shown).  I continue to use NaOAc during EtOH precips of cycle sequencing reactions.

1) Add NaCl to 0.15M.  No need to be exact here.  Use a 5M NaCl stock and you won't need to add much, so volume compensation is unnecessary.  For instance, if you have a 300uL sample, calculate as follows:  (300uL * 0.15M)/5M = 9uL.  So just add 9uL of 5M NaCl and save yourself the headache.  It's not worth it!!

2) Add 2.5 vols EtOH.  Many protocols say add 2-2.5 vols EtOH.  I think this is problematic since if you use 2 vols, your final EtOH concentration will be only 66% whereas at 2.5 vols you achieve about 71%.  This may seem subtle, but my experience (admittedly anecdotal here) has taught me 2.5 is always better.

3) Mix sample well, and place directly into the centrifuge.  For samples with decent concentration, 30 min at maximum speed.  For low concentrations, longer is better, so maybe 30 min to 60 min (your choice).

4) Immediately after the centrifuge stops, remove the tubes and decant into the sink (plates can be GENTLY centrifuged inverted on a paper towel at low RPM, low acceleration/deceleration for ~10 sec to remove supernatant as per ABI sequencing protocol).  If you happen to be out of the room when your centrifuge stops, start it again for 5 min or so to ensure your DNA is well-adhered to the wall of your tube.

5) Add 70% EtOH.  For 1.5mL tubes I usually do 500uL, for 96 well plates, I usually add 50uL.  Spin again at max speed, but only 10-15 min is necessary now.

6) Decant again as in step 4.  Dry samples down in a vacuum centrifuge, or place on a heat block (~55C) for 10 min to evaporate any residual ethanol.  If you have leftover EtOH and you try to run a gel, your sample will maddeningly just float away.  It can also inhibit downstream enzymatic steps.

7) Resuspend in your favorite solution.  Some people like water, but since the pH of "pure" water is often a bit acidic, depurination is a real threat to your sample.  Some people like TE, but even small amounts of EDTA can inhibit a PCR reaction.  For these reasons, my go-to solution is 10mM Tris-Cl, pH 8.5.
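The arithmetic in steps 1 and 2 can be wrapped in a couple of throwaway helpers (my own sketch, not part of any kit or protocol):

```python
def salt_spike_ul(sample_ul, target_m=0.15, stock_m=5.0):
    """Volume of concentrated salt stock to add, ignoring the tiny volume change."""
    return sample_ul * target_m / stock_m

def final_etoh_pct(etoh_vols, sample_vols=1.0):
    """Final ethanol percentage after adding `etoh_vols` volumes of ethanol."""
    return 100.0 * etoh_vols / (sample_vols + etoh_vols)

print(salt_spike_ul(300))              # 9.0 uL of 5M NaCl for a 300 uL sample
print(round(final_etoh_pct(2.0), 1))   # 66.7 -- the "only 66%" case
print(round(final_etoh_pct(2.5), 1))   # 71.4 -- the preferred 2.5 vols
```

Nothing here needs to be exact, which is the point: eyeballing 2.5 vols gets you comfortably above the ~70% ethanol you want.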

So, I hope someone finds this useful.  Happy precipitating!!



Wednesday, May 23, 2012

PCR: not so much on Tuesdays

It's been too long since my last post, but I've been busy.  The semester is over, written and oral comps are passed, and I can return to being productive with some tinkering on the side.  I've recently been beset by updating some equipment usage logs, which isn't hard but takes up time I'd rather use for other things.

When looking at thermal cycler usage for the spring 2012 semester, one interesting thing pops out: Tuesday instrument usage is markedly lower (~40%) than other weekdays.


The numbers are hours used per day of the week from January through May.

What can account for this?  Off the top of my head, I think a couple of things contribute.  First, people come to lab on Monday with ambitions for the week and maybe some fresh thoughts contrived over the weekend about some result or another.  Once those results are obtained, perhaps some other steps or some reflection on the new results are necessary before proceeding, hence the dip on Tuesday.  I think people then "push" to get the rest of their planned work done across the rest of the week, thus maintaining regular instrument usage, even on Fridays.  A few of us obviously spent some fun weekends in lab, too.  Someone I showed this to suggested that perhaps class schedules impede research on Tuesdays, but since Thursdays are well utilized, that seems unlikely given the Tues/Thurs or M/W/F scheduling of classes here at NAU.

So there it is.  A mostly pointless blog post to help me feel better about not putting anything up for a while.  Maybe something more substantive next time...


Wednesday, March 28, 2012

tailed 454 amplicon sequencing fail

Our lab has been sending samples to an out-of-state lab for 454 sequencing of amplicon pools for almost a year now.  We have so far used this only to assess fungal diversity (ITS region), but in theory we could come up with all sorts of applications.  Other labs at NAU have been doing the same thing, but looking at various targets to assess various questions of diversity (bacterial, archaeal, fungal, functional gene diversity, etc.).  Another tool I routinely use is tailed fluorescent primers for labeling microsatellites I amplify from my field studies (see Schuelke, M. (2000). An economic method for the fluorescent labeling of PCR fragments. Nature Biotechnology, 18, 233-234).  Not too long ago, I was thinking tailed primers would be useful for amplifying samples for 454 sequencing, but it is very expensive to just test this idea out.  Instead, I used the approach successfully for doing tRFLPs as QC prior to sending samples off for 454 sequencing.  Later I realized this was discussed in the GS Junior Guide for Experimental Design, though some details, and crucially a discussion of the data produced, were missing.

So I was excited to see a couple of recent publications:

Bybee, S. M., Bracken-Grissom, H., Haynes, B. D., Hermansen, R. A., Byers, R. L., Clement, M. J., Udall, J. A., et al. (2011). Targeted amplicon sequencing (TAS): A scalable next-gen approach to multi-locus, multi-taxa phylogenetics. Genome Biology and Evolution, 3, 1312-1323.

Daigle, D., Simen, B., & Pochart, P. (2011). High-Throughput Sequencing of PCR Products Tagged with Universal Primers Using 454 Life Sciences Systems. Current Protocols in Molecular Biology, 96, 7.5.1-7.5.14.

The authors carefully go through the steps involved in using tailed primers and detail their results.  So in order to maximize the monetary power of all the labs doing this here at NAU, I proposed to try using a set of tailed primers to barcode a sample for 454 amplicon sequencing.  I did not spend a lot of time on it, but things did not go as smoothly as I had hoped.

Here are the primers I used:
M13f-ITS1F 5'-CGCCAGGGTTTTCCCAGTCACGACCTTGGTCATTTAGAGGAAGTAA-3'
M13r-ITS4 5'-TCACACAGGAAACAGCTATGACTCCTCCGCTTATTGATATGC-3'
LibAf-MID01-M13f 5'-CGTATCGCCTCCCTCGCGCCATCAGACGAGTGCGTCGCCAGGGTTTTCCCAGTCACGAC-3'
LibAr-MID01-M13r
5'-CTATGCGCCTTGCCAGCCCGCTCAGACGAGTGCGTTCACACAGGAAACAGCTATGAC-3'
LibAf 5'-CGTATCGCCTCCCTCGCGCCATCAG-3'
LibAr 5'-CTATGCGCCTTGCCAGCCCGCTCAG-3'

The M13-ITS primers have the ITS primer at the 3' end and the M13 sequence at the 5' end.  The LibA-MID-M13 primers have the LibA sequence at the 5' end, MID01 in the middle, and the M13 sequence at the 3' end.
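Since tail bookkeeping is easy to get wrong, a quick check confirms that the tailing primers really are the straight concatenations just described (all sequences copied from the primer list above; MID01 = ACGAGTGCGT is the middle segment):

```python
# Sequences copied from the primer list above.
M13F_TAIL = "CGCCAGGGTTTTCCCAGTCACGAC"   # M13 portion of M13f-ITS1F
M13R_TAIL = "TCACACAGGAAACAGCTATGAC"     # M13 portion of M13r-ITS4
MID01 = "ACGAGTGCGT"
LIBAF = "CGTATCGCCTCCCTCGCGCCATCAG"
LIBAR = "CTATGCGCCTTGCCAGCCCGCTCAG"
LIBAF_MID01_M13F = "CGTATCGCCTCCCTCGCGCCATCAGACGAGTGCGTCGCCAGGGTTTTCCCAGTCACGAC"
LIBAR_MID01_M13R = "CTATGCGCCTTGCCAGCCCGCTCAGACGAGTGCGTTCACACAGGAAACAGCTATGAC"

# Each tailing primer should be LibA + MID + M13 tail, read 5' to 3'.
assert LIBAF_MID01_M13F == LIBAF + MID01 + M13F_TAIL
assert LIBAR_MID01_M13R == LIBAR + MID01 + M13R_TAIL
print("tailing primers are clean concatenations")
```

This kind of two-line assertion is cheap insurance before ordering a whole set of barcoded oligos.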

First amplification (20uL reactions, 0.01U/uL Phusion polymerase, MgCl2 at 2.5mM, M13-ITS primers at 200nM; 30 cycles of 90C 30s, 57C 30s, 72C 1min)


There are 4 samples in the gel (2uL/well).  First four wells are each sample at 1X DNA concentration, 2nd four DNA at 1/20X.  Ladder is (bp) 2000, 800, 400, 200, 100.  Note the significant artifact around 100bp.

I column purified the 1/20X products with Epoch Life Science columns (http://www.epochlifescience.com/Product/SpinColumn/minispin.aspx) and quantified with OD260.  All purified products were around 40 ng/uL.
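The OD260 arithmetic is the standard conversion for double-stranded DNA (1 A260 unit is roughly 50 ng/uL); the readings below are hypothetical, chosen to land near the ~40 ng/uL measured above:

```python
def dsdna_ng_per_ul(a260, dilution_factor=1.0):
    """Standard conversion: 1 OD unit at 260 nm ~ 50 ng/uL double-stranded DNA."""
    return a260 * 50.0 * dilution_factor

print(dsdna_ng_per_ul(0.8))        # 40.0 ng/uL from an undiluted read
print(dsdna_ng_per_ul(0.08, 10))   # 40.0 ng/uL from a 1:10 dilution
```

Remember the 50 ng/uL factor applies to double-stranded DNA only; single-stranded DNA and RNA use different conversion factors.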

Here is a gel of the column purified products (10uL/well):

So I co-purified my 100bp artifact, but I continued nonetheless.  To try to reduce the amplification of the non-specific product, I added 2.5uL each purified product to 1000uL 10mM Tris-Cl pH 8.8.  I amplified under same conditions both the raw column purified product and the diluted product using LibA-MID01-M13 primers this time:

First 4 samples are using full-strength column purified DNA, second 4 samples are using the dilution.  The ITS product looks really weak now, and the 100bp artifact has dominated my PCR.  No bueno!

I reran the raw M13-ITS products on a gel and cut out the ITS bands this time.  I was generous with the size of the gel slices in case there were products I couldn't see.  Briefly, I put each gel slice in a 1.5mL tube, added three 2mm steel beads, and beat the gel in the GenoGrinder for 15s at about 28Hz.  I spun the gel down briefly, added 200uL Tris-Cl, vortexed, and placed the tube in a 65C block for 5 min.  I spun 1 min at 15,000rpm and pipetted off 100uL of supernatant.  Gel-purified products extruded!

I then redid the LibA tagging using the LibA-MID01-M13 primers on my gel-purified samples (full strength only):

Each pair of wells is 8uL of column-purified M13-ITS product, then 2uL of LibA-tagged product from the last PCR round.  You can see the gel purification made a huge difference in maintaining specificity.  You can also see the subtle gel shift as the LibA tagging incorporates another 50bp or so.  Still, some unwanted product remains, particularly now at about 180bp.

So I thought I could change the PCR conditions and improve things.  I cut out the excess MgCl2 so that it was back to 1.5mM and used the following PCR conditions: 30 cycles of 90C 30s, 63C 2 min, 72C 15s.  The initial PCR (M13-ITS primers) yielded OK results:

The top comb is from another project, but the bottom comb is the same four samples in a PCR serial dilution (2uL/well).  First 4 samples, 1X DNA; 2nd 4, 1/10X DNA; 3rd 4, 1/100X DNA.  I thought perhaps the column cleanup was unnecessary and that carryover primer was causing my 180bp non-specific product in the LibA tailing step, so this time I used Exonuclease I to clean up the samples (just the 1X and 1/10X products).

Using the same altered PCR conditions, I did the LibA tailing step:

(First 4 wells, 1X DNA template; 2nd 4 wells, 1/10X template; 9th well, no DNA template.)  That pesky 180bp artifact is still present, so ExoI didn't help in this case.  That, plus the negative-control result this time, means the artifact is definitely a double-stranded primer-dimer of some sort.  Next I reran the first PCR of this series (M13-ITS primers) on a gel and did another gel extraction (1X and 1/10X products):

You can see the 180bp product is pretty strong still and the ITS product is present, but weak.  I attribute this to the fact that the initial ITS product with this method was pretty weak.  I think it could be partially resolved by adding a few more cycles to the PCR and then doing the gel extraction.  Having your product of interest in molar excess seems crucial to this process.

A colleague I chatted with was bothered by the number of PCR reactions needed to generate a 454 amplicon library this way.  I agreed that it would unduly bias the pool, making relative abundances almost meaningless, and then I thought of another solution: put all the primers in the same mix and do a single amplification!  I didn't have time to redo the initial amplification, even though I probably should have, but I thought I could use the last gel-purified products, which used the M13-ITS primers:

These are products from just the 1/10X gel-purified products as template (2uL/well).  I included LibA primers (no tail) at 200nM each as drivers of the desired products, along with LibA-MID01-M13 primers at 20nM each to get the reaction going.  As you can see, the 180bp product is still quite prominent, but the ITS products came out much better this time.  Perhaps using a stronger initial M13-ITS product, or even including M13-ITS in the whole mix, would yield a cleaner amplicon pool.  For now I am out of time for this, and these products are not sequenceable.  We will order a set of barcoded primers specific to the loci we are interested in and gasket our 454 plates until something better comes along or we solve the non-specific product issue.  Bybee et al. (2011) stated that such non-specific products used up a third of the reads on their plates.  We can't have that!