dask: UnpicklingError: invalid load key - with Large Datasets

Hello All,

I hope you are having a nice day.

I am consistently hitting the following issue. Note that the load key changes from run to run, but it is always some obscure character.

-bash-4.1$ python2.7 cat_predict2.py
Traceback (most recent call last):
  File "cat_predict2.py", line 83, in <module>
    df.to_csv(output_file, index = True, sep = '\t')
  File "/opt/rh/python27/root/usr/lib/python2.7/site-packages/dask/dataframe/core.py", line 1105, in to_csv
    return to_csv(self, filename, **kwargs)
  File "/opt/rh/python27/root/usr/lib/python2.7/site-packages/dask/dataframe/io/csv.py", line 636, in to_csv
    delayed(values).compute(get=get, scheduler=scheduler)
  File "/opt/rh/python27/root/usr/lib/python2.7/site-packages/dask/base.py", line 156, in compute
    (result,) = compute(self, traverse=False, **kwargs)
  File "/opt/rh/python27/root/usr/lib/python2.7/site-packages/dask/base.py", line 395, in compute
    results = schedule(dsk, keys, **kwargs)
  File "/opt/rh/python27/root/usr/lib/python2.7/site-packages/dask/threaded.py", line 75, in get
    pack_exception=pack_exception, **kwargs)
  File "/opt/rh/python27/root/usr/lib/python2.7/site-packages/dask/local.py", line 501, in get_async
    raise_exception(exc, tb)
  File "/opt/rh/python27/root/usr/lib/python2.7/site-packages/dask/local.py", line 272, in execute_task
    result = _execute_task(task, data)
  File "/opt/rh/python27/root/usr/lib/python2.7/site-packages/dask/local.py", line 253, in _execute_task
    return func(*args2)
  File "/opt/rh/python27/root/usr/lib/python2.7/site-packages/dask/optimization.py", line 942, in __call__
    dict(zip(self.inkeys, args)))
  File "/opt/rh/python27/root/usr/lib/python2.7/site-packages/dask/core.py", line 132, in _get_recursive
    res = cache[x] = _get_recursive(dsk, dsk[x], cache)
  File "/opt/rh/python27/root/usr/lib/python2.7/site-packages/dask/core.py", line 136, in _get_recursive
    args2 = [_get_recursive(dsk, k, cache) for k in args]
  File "/opt/rh/python27/root/usr/lib/python2.7/site-packages/dask/core.py", line 126, in _get_recursive
    return [_get_recursive(dsk, k, cache) for k in x]
  File "/opt/rh/python27/root/usr/lib/python2.7/site-packages/dask/core.py", line 132, in _get_recursive
    res = cache[x] = _get_recursive(dsk, dsk[x], cache)
  File "/opt/rh/python27/root/usr/lib/python2.7/site-packages/dask/core.py", line 136, in _get_recursive
    args2 = [_get_recursive(dsk, k, cache) for k in args]
  File "/opt/rh/python27/root/usr/lib/python2.7/site-packages/dask/core.py", line 126, in _get_recursive
    return [_get_recursive(dsk, k, cache) for k in x]
  File "/opt/rh/python27/root/usr/lib/python2.7/site-packages/dask/core.py", line 136, in _get_recursive
    args2 = [_get_recursive(dsk, k, cache) for k in args]
  File "/opt/rh/python27/root/usr/lib/python2.7/site-packages/dask/core.py", line 126, in _get_recursive
    return [_get_recursive(dsk, k, cache) for k in x]
  File "/opt/rh/python27/root/usr/lib/python2.7/site-packages/dask/core.py", line 137, in _get_recursive
    return func(*args2)
  File "/opt/rh/python27/root/usr/lib/python2.7/site-packages/dask/dataframe/shuffle.py", line 419, in collect
    res = p.get(part)
  File "/opt/rh/python27/root/usr/lib/python2.7/site-packages/partd/core.py", line 73, in get
    return self.get([keys], **kwargs)[0]
  File "/opt/rh/python27/root/usr/lib/python2.7/site-packages/partd/core.py", line 79, in get
    return self._get(keys, **kwargs)
  File "/opt/rh/python27/root/usr/lib/python2.7/site-packages/partd/encode.py", line 30, in _get
    for chunk in raw]
  File "/opt/rh/python27/root/usr/lib/python2.7/site-packages/partd/pandas.py", line 170, in deserialize
    headers = pickle.loads(frames[0])
cPickle.UnpicklingError: invalid load key, 'Ë'.
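
For reference, the failure is inside partd, the on-disk store that dask's shuffle uses to spill partitions: dataframes are pickled into frames on write and unpickled on read, and the read side is what raises. A minimal round-trip sketch of that cycle, assuming the partd API of this era ('partd-scratch' is just a placeholder directory):

import pandas as pd
import partd

# PandasBlocks pickles each dataframe into frames that partd.File
# appends to files on disk; get() unpickles them again, which is the
# step that raises UnpicklingError in the traceback above.
p = partd.PandasBlocks(partd.File('partd-scratch'))
df = pd.DataFrame({'id': [1, 2], 'val': ['a', 'b']})
p.append({'part-0': df})
print(p.get('part-0'))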

My script is very simple and works just fine with smaller data sets. Currently I am trying to merge 9 files, each between 100 million and 500 million rows with 2-3 columns.

Note this is on a Linux server with 90 GB of RAM and 2.3 TB of storage.

The code I am running is simply:

import os
from timeit import default_timer as timer

import dask
import dask.dataframe as dd
import pandas as pd
from dask.distributed import Client

# gather every file in the directory whose name starts with 'val'
path = r'/location'
files = [os.path.join(path, x) for x in os.listdir(path) if x.startswith('val')]

# start from a one-column seed frame and outer-merge each file on its 'id' index
df = dd.from_pandas(pd.DataFrame(columns=['delete']), npartitions=1)
for file in files:
    new = dd.read_csv(file)
    new['iid'] = new['id'].astype(int)
    new = new.set_index('id')
    df = df.merge(new, left_index=True, right_index=True, how='outer')

start = timer()

# dask expands the '*' into one output file per partition
output_file = 'master_*.csv'
df.to_csv(output_file, index=True, sep='\t')

end = timer()
print("Time Taken to write: ", end - start)

Thanks so much for your help!

About this issue

  • Original URL
  • State: closed
  • Created 6 years ago
  • Comments: 18 (9 by maintainers)

Most upvoted comments

@mrocklin the two attached scripts should reproduce the issue: Archive.zip

Just run:

./dask_tbl_gen.py -n 2000000 > TEST.csv
./merge_tables_wDask.py -p 12 TEST.csv TEST.csv > TEST1-1.csv

You may need to run the random data generator a couple of times before you generate a file that produces the error. I was able to reproduce the same _pickle.UnpicklingError: invalid load key, '\xdd'. error multiple times in a row across different runs of ./dask_tbl_gen.py -n 2000000, but sometimes the random data may not trigger the error.
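
For readers without the zip, a hypothetical stand-in for dask_tbl_gen.py (not the attached script) that writes n random key,count rows to stdout might look roughly like this:

#!/usr/bin/env python
# Hypothetical stand-in for the attached dask_tbl_gen.py; the real
# script is in Archive.zip. Writes n random key,count rows to stdout.
import argparse
import random
import string

parser = argparse.ArgumentParser()
parser.add_argument('-n', type=int, default=1000000)
args = parser.parse_args()

print('idx,count')
for _ in range(args.n):
    key = ''.join(random.choices(string.ascii_uppercase + string.digits, k=40))
    print('{},{}'.format(key, random.randint(1, 50)))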

My conda env info is listed below.

I’m running into the same error:

Traceback (most recent call last):
  File "../bin/scripts/merge_tables_wDask.py", line 108, in <module>
    main(args)
  File "../bin/scripts/merge_tables_wDask.py", line 80, in main
    header=args.header)
  File "../bin/scripts/merge_tables_wDask.py", line 50, in load_csv
    df = df.set_index('idx').compute(scheduler="multiprocessing", num_workers=threads)
  File "/ebio/abt3_projects/software/dev/llmgag/.snakemake/conda/832db746/lib/python3.7/site-packages/dask/base.py", line 156, in compute
    (result,) = compute(self, traverse=False, **kwargs)
  File "/ebio/abt3_projects/software/dev/llmgag/.snakemake/conda/832db746/lib/python3.7/site-packages/dask/base.py", line 402, in compute
    results = schedule(dsk, keys, **kwargs)
  File "/ebio/abt3_projects/software/dev/llmgag/.snakemake/conda/832db746/lib/python3.7/site-packages/dask/multiprocessing.py", line 177, in get
    raise_exception=reraise, **kwargs)
  File "/ebio/abt3_projects/software/dev/llmgag/.snakemake/conda/832db746/lib/python3.7/site-packages/dask/local.py", line 521, in get_async
    raise_exception(exc, tb)
  File "/ebio/abt3_projects/software/dev/llmgag/.snakemake/conda/832db746/lib/python3.7/site-packages/dask/compatibility.py", line 68, in reraise
    raise exc.with_traceback(tb)
  File "/ebio/abt3_projects/software/dev/llmgag/.snakemake/conda/832db746/lib/python3.7/site-packages/dask/local.py", line 290, in execute_task
    result = _execute_task(task, data)
  File "/ebio/abt3_projects/software/dev/llmgag/.snakemake/conda/832db746/lib/python3.7/site-packages/dask/local.py", line 270, in _execute_task
    args2 = [_execute_task(a, cache) for a in args]
  File "/ebio/abt3_projects/software/dev/llmgag/.snakemake/conda/832db746/lib/python3.7/site-packages/dask/local.py", line 270, in <listcomp>
    args2 = [_execute_task(a, cache) for a in args]
  File "/ebio/abt3_projects/software/dev/llmgag/.snakemake/conda/832db746/lib/python3.7/site-packages/dask/local.py", line 267, in _execute_task
    return [_execute_task(a, cache) for a in arg]
  File "/ebio/abt3_projects/software/dev/llmgag/.snakemake/conda/832db746/lib/python3.7/site-packages/dask/local.py", line 267, in <listcomp>
    return [_execute_task(a, cache) for a in arg]
  File "/ebio/abt3_projects/software/dev/llmgag/.snakemake/conda/832db746/lib/python3.7/site-packages/dask/local.py", line 270, in _execute_task
    args2 = [_execute_task(a, cache) for a in args]
  File "/ebio/abt3_projects/software/dev/llmgag/.snakemake/conda/832db746/lib/python3.7/site-packages/dask/local.py", line 270, in <listcomp>
    args2 = [_execute_task(a, cache) for a in args]
  File "/ebio/abt3_projects/software/dev/llmgag/.snakemake/conda/832db746/lib/python3.7/site-packages/dask/local.py", line 267, in _execute_task
    return [_execute_task(a, cache) for a in arg]
  File "/ebio/abt3_projects/software/dev/llmgag/.snakemake/conda/832db746/lib/python3.7/site-packages/dask/local.py", line 267, in <listcomp>
    return [_execute_task(a, cache) for a in arg]
  File "/ebio/abt3_projects/software/dev/llmgag/.snakemake/conda/832db746/lib/python3.7/site-packages/dask/local.py", line 271, in _execute_task
    return func(*args2)
  File "/ebio/abt3_projects/software/dev/llmgag/.snakemake/conda/832db746/lib/python3.7/site-packages/dask/dataframe/shuffle.py", line 414, in collect
    res = p.get(part)
  File "/ebio/abt3_projects/software/dev/llmgag/.snakemake/conda/832db746/lib/python3.7/site-packages/partd/core.py", line 73, in get
    return self.get([keys], **kwargs)[0]
  File "/ebio/abt3_projects/software/dev/llmgag/.snakemake/conda/832db746/lib/python3.7/site-packages/partd/core.py", line 79, in get
    return self._get(keys, **kwargs)
  File "/ebio/abt3_projects/software/dev/llmgag/.snakemake/conda/832db746/lib/python3.7/site-packages/partd/encode.py", line 30, in _get
    for chunk in raw]
  File "/ebio/abt3_projects/software/dev/llmgag/.snakemake/conda/832db746/lib/python3.7/site-packages/partd/encode.py", line 30, in <listcomp>
    for chunk in raw]
  File "/ebio/abt3_projects/software/dev/llmgag/.snakemake/conda/832db746/lib/python3.7/site-packages/partd/encode.py", line 29, in <listcomp>
    return [self.join([self.decode(frame) for frame in framesplit(chunk)])
  File "/ebio/abt3_projects/software/dev/llmgag/.snakemake/conda/832db746/lib/python3.7/site-packages/partd/pandas.py", line 175, in deserialize
    for (h, b) in zip(headers[2:], bytes[2:])]
  File "/ebio/abt3_projects/software/dev/llmgag/.snakemake/conda/832db746/lib/python3.7/site-packages/partd/pandas.py", line 175, in <listcomp>
    for (h, b) in zip(headers[2:], bytes[2:])]
  File "/ebio/abt3_projects/software/dev/llmgag/.snakemake/conda/832db746/lib/python3.7/site-packages/partd/pandas.py", line 136, in block_from_header_bytes
    copy=True).reshape(shape)
  File "/ebio/abt3_projects/software/dev/llmgag/.snakemake/conda/832db746/lib/python3.7/site-packages/partd/numpy.py", line 115, in deserialize
    blocks = [pickle.loads(f) for f in framesplit(bytes)]
  File "/ebio/abt3_projects/software/dev/llmgag/.snakemake/conda/832db746/lib/python3.7/site-packages/partd/numpy.py", line 115, in <listcomp>
    blocks = [pickle.loads(f) for f in framesplit(bytes)]
_pickle.UnpicklingError: invalid load key, '\xdd'.

…and after some troubleshooting, I can see why it is so hard to create a reproducible example. I tested this by trimming the input csv files down with head. In this case, I just joined 2 tables with dask. The relevant section of my python script:

df = load_csv(args.input_table[0],
              sep=args.input_sep,
              threads=args.processes,
              keep_path=args.keep_path,
              header=args.header)
# merging
for i in range(1, len(args.input_table)):
    logging.info('Joining with: {}'.format(args.input_table[i]))
    df = df.join(load_csv(args.input_table[i],
                          sep=args.input_sep,
                          threads=args.processes,
                          keep_path=args.keep_path,
                          header=args.header),
                 how='outer',
                 on=args.join_on_key,
                 lsuffix='_XXX',
                 rsuffix='_YYY')

The error actually occurred while loading the first csv file, so the second didn’t matter. I could get rid of the error by trimming the file to the first 90000 lines with head, but the error occurred if I trimmed to the first 1000000 lines. I further narrowed it down (see the bisection sketch after the table):

# number of lines; error/no-error
907000 -> no error
907500 -> no error
907800 -> no error
907850 -> no error
907900 -> error
908000 -> error
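
The narrowing can be automated with a small bisection script, sketched here under the assumption that the failure is deterministic for a given prefix length ('FILE.csv' is a placeholder; the 'idx' column matches the traceback above):

import itertools

import dask.dataframe as dd

def fails(path, nlines, tmp='trimmed.csv'):
    # trim path to its first nlines lines, then retry the failing step
    with open(path) as src, open(tmp, 'w') as dst:
        dst.writelines(itertools.islice(src, nlines))
    try:
        dd.read_csv(tmp).set_index('idx').compute(scheduler='multiprocessing')
        return False
    except Exception:  # the UnpicklingError surfaces here
        return True

lo, hi = 907850, 907900  # known-good / known-bad bounds from the table
while hi - lo > 1:
    mid = (lo + hi) // 2
    if fails('FILE.csv', mid):
        hi = mid
    else:
        lo = mid
print('first failing line count:', hi)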

Lines 907850 to 907900 look like the following:

ST-J00101_75_HLCTNBBXX_2_1117_13951_30239_Orf_2519017_0_150_1_1_,14
ST-J00101_75_HLCTNBBXX_2_1104_32238_7486_Orf_807154_257_18446744073709551615_1_1_,9
ST-J00101_75_HLCTNBBXX_2_2223_30624_47489_Orf_1411382_149_18446744073709551615_1_1_,6
ST-J00101_72_HJ5JVBBXX_4_1115_20811_41000_Orf_1810053_0_150_1_1_,4
ST-J00101_75_HLCTNBBXX_2_2215_28250_35778_Orf_1917294_1_226_1_1_,39
ST-J00101_75_HLCTNBBXX_2_2119_9607_10475_Orf_2296838_0_240_1_1_,41
ST-J00101_75_HLCTNBBXX_2_1206_5761_37818_Orf_1154098_0_147_1_1_,19
ST-J00101_75_HLCTNBBXX_2_1102_15240_17843_Orf_1701223_0_150_1_1_,10
ST-J00101_75_HLCTNBBXX_2_1213_4310_24718_Orf_2150625_2_251_1_1_,40
ST-J00101_72_HJ5JVBBXX_4_2211_6979_32226_Orf_2784885_1_208_1_1_,12
ST-J00101_75_HLCTNBBXX_2_2119_6411_40491_Orf_2312878_149_18446744073709551615_1_1_,21
ST-J00101_72_HJ5JVBBXX_5_1204_22627_39963_Orf_1708665_149_18446744073709551615_1_1_,5
ST-J00101_72_HJ5JVBBXX_4_2218_13514_14537_Orf_645702_0_273_1_1_,8
ST-J00101_72_HJ5JVBBXX_2_2110_5883_38240_Orf_2438498_0_147_1_1_,2
ST-J00101_72_HJ5JVBBXX_7_2202_7293_42689_Orf_2893594_279_0_1_1_,3
ST-J00101_72_HJ5JVBBXX_3_1112_16589_48948_Orf_1212582_147_0_1_1_,5
ST-J00101_72_HJ5JVBBXX_2_1106_11891_43937_Orf_1473707_0_150_1_1_,3
ST-J00101_75_HLCTNBBXX_3_1206_9333_13658_Orf_663131_147_0_1_1_,3
ST-J00101_70_HKTWNBBXX_2_2201_10561_31154_Orf_2648178_287_18446744073709551615_1_1_,5
ST-J00101_70_HKTWNBBXX_2_2224_3681_16524_Orf_764600_149_18446744073709551615_1_1_,4
ST-J00101_75_HLCTNBBXX_3_2210_10348_25544_Orf_3022785_270_0_1_1_,12
ST-J00101_72_HJ5JVBBXX_3_2113_20395_37888_Orf_692352_213_0_1_1_,6
ST-J00101_75_HLCTNBBXX_3_2116_11830_47102_Orf_898993_148_1_1_1_,4
ST-J00101_70_HKTWNBBXX_3_1206_4665_23716_Orf_318083_147_0_1_1_,5
ST-J00101_75_HLCTNBBXX_2_2209_6522_26371_Orf_2320035_1_259_1_1_,5
ST-J00101_75_HLCTNBBXX_2_1204_19076_8031_Orf_1607975_2_227_1_1_,6
ST-J00101_72_HJ5JVBBXX_6_1216_16133_45133_Orf_2368737_0_150_1_1_,6
ST-J00101_70_HKTWNBBXX_3_1205_2148_30749_Orf_2016687_271_1_1_1_,4
ST-J00101_70_HKTWNBBXX_4_2204_29061_2510_Orf_312895_2_263_1_1_,14
ST-J00101_70_HKTWNBBXX_4_2103_19908_41053_Orf_2546747_1_226_1_1_,12
ST-J00101_72_HJ5JVBBXX_8_1122_5335_21606_Orf_2293574_2_224_1_1_,13
ST-J00101_72_HJ5JVBBXX_8_1127_8937_10405_Orf_1899169_274_1_1_1_,18
ST-J00101_72_HJ5JVBBXX_8_2123_12236_44781_Orf_2089023_149_18446744073709551615_1_1_,14
ST-J00101_70_HKTWNBBXX_4_2206_25428_23329_Orf_2375944_215_18446744073709551615_1_1_,15
ST-J00101_72_HJ5JVBBXX_6_1216_16133_45133_Orf_2368738_2_149_1_1_,4
ST-J00101_75_HLCTNBBXX_2_1224_5720_38416_Orf_187414_147_0_1_1_,2
ST-J00101_75_HLCTNBBXX_2_2224_27194_12181_Orf_2461494_0_147_1_1_,21
ST-J00101_72_HJ5JVBBXX_8_2208_18233_18669_Orf_860868_148_1_1_1_,6
ST-J00101_70_HKTWNBBXX_3_2222_31314_11864_Orf_123777_147_0_1_1_,16
ST-J00101_70_HKTWNBBXX_3_1206_11728_5112_Orf_2205352_148_1_1_1_,11
ST-J00101_72_HJ5JVBBXX_6_1116_12297_29835_Orf_1785985_149_18446744073709551615_1_1_,17
ST-J00101_70_HKTWNBBXX_3_2228_23835_20709_Orf_1509741_149_18446744073709551615_1_1_,23
ST-J00101_72_HJ5JVBBXX_4_2220_23815_10475_Orf_1467076_0_198_1_1_,7
ST-J00101_72_HJ5JVBBXX_5_2112_24789_10510_Orf_1506055_2_242_1_1_,17
ST-J00101_75_HLCTNBBXX_2_1220_15067_36605_Orf_178249_2_272_1_1_,12
ST-J00101_75_HLCTNBBXX_2_2109_27915_42583_Orf_645322_234_0_1_1_,35
ST-J00101_70_HKTWNBBXX_3_1223_8704_34160_Orf_2659593_0_150_1_1_,5
ST-J00101_72_HJ5JVBBXX_5_1226_19735_45010_Orf_480009_0_150_1_1_,5
ST-J00101_75_HLCTNBBXX_2_1203_30472_33123_Orf_1017622_147_0_1_1_,27
ST-J00101_72_HJ5JVBBXX_5_2205_5680_4690_Orf_57319_0_264_1_1_,12

I don’t see anything wrong. I tried to just use this section of the file as input, but there was no error. I tried using this section plus the 10000 or 100000 lines prior, but no error. I tried just removing the header of the csv (head -n 907900 FILE.csv | tail -n +2 > OUTPUT.csv), and then there was an error. I tried removing the first 50 lines, and then no error.

So, it appears that the error results from having more than a certain number of lines in the csv (i.e., > ~907850).

I also tried searching for the load key (‘\xdd’) but could not find it in my csv file.
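
A quick way to double-check that (a sketch; 'FILE.csv' is a placeholder):

# If the offending byte is absent from the raw csv, it must be produced
# while partd writes or reads its on-disk frames, not read from the input.
with open('FILE.csv', 'rb') as f:
    offset = f.read().find(b'\xdd')
print('first 0xdd byte offset (-1 = not present):', offset)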

Conda env:

# Name                    Version                   Build  Channel
blas                      1.0                         mkl
bokeh                     1.0.4                 py37_1000    conda-forge
bzip2                     1.0.6             h14c3975_1002    conda-forge
ca-certificates           2018.11.29           ha4d7672_0    conda-forge
certifi                   2018.11.29            py37_1000    conda-forge
click                     7.0                        py_0    conda-forge
cloudpickle               0.6.1                      py_0    conda-forge
cytoolz                   0.9.0.1         py37h14c3975_1001    conda-forge
dask                      0.18.1                     py_0    conda-forge
dask-core                 0.18.1                     py_0    conda-forge
distributed               1.25.2                py37_1000    conda-forge
freetype                  2.9.1             h94bbf69_1005    conda-forge
heapdict                  1.0.0                 py37_1000    conda-forge
intel-openmp              2019.1                      144
jinja2                    2.10                       py_1    conda-forge
jpeg                      9c                h14c3975_1001    conda-forge
libffi                    3.2.1             hf484d3e_1005    conda-forge
libgcc-ng                 7.3.0                hdf63c60_0    conda-forge
libgfortran-ng            7.2.0                hdf63c60_3    conda-forge
libpng                    1.6.36            h84994c4_1000    conda-forge
libstdcxx-ng              7.3.0                hdf63c60_0    conda-forge
libtiff                   4.0.10            h9022e91_1002    conda-forge
locket                    0.2.0                      py_2    conda-forge
markupsafe                1.1.0           py37h14c3975_1000    conda-forge
mkl                       2019.1                      144
mkl_fft                   1.0.10           py37h14c3975_1    conda-forge
mkl_random                1.0.2            py37h637b7d7_2    conda-forge
msgpack-python            0.6.0           py37h6bb024c_1000    conda-forge
ncurses                   6.1               hf484d3e_1002    conda-forge
numpy                     1.15.4           py37h7e9f1db_0
numpy-base                1.15.4           py37hde5b4d6_0
olefile                   0.46                       py_0    conda-forge
openssl                   1.0.2p            h14c3975_1002    conda-forge
packaging                 19.0                       py_0    conda-forge
pandas                    0.24.0           py37hf484d3e_0    conda-forge
partd                     0.3.9                      py_0    conda-forge
pillow                    5.4.1           py37h00a061d_1000    conda-forge
pip                       18.1                  py37_1000    conda-forge
psutil                    5.4.8           py37h14c3975_1000    conda-forge
pyparsing                 2.3.1                      py_0    conda-forge
python                    3.7.1             hd21baee_1000    conda-forge
python-dateutil           2.7.5                      py_0    conda-forge
pytz                      2018.9                     py_0    conda-forge
pyyaml                    3.13            py37h14c3975_1001    conda-forge
readline                  7.0               hf8c457e_1001    conda-forge
setuptools                40.6.3                   py37_0    conda-forge
six                       1.12.0                py37_1000    conda-forge
sortedcontainers          2.1.0                      py_0    conda-forge
sqlite                    3.26.0            h67949de_1000    conda-forge
tblib                     1.3.2                      py_1    conda-forge
tk                        8.6.9             h84994c4_1000    conda-forge
toolz                     0.9.0                      py_1    conda-forge
tornado                   5.1.1           py37h14c3975_1000    conda-forge
wheel                     0.32.3                   py37_0    conda-forge
xz                        5.2.4             h14c3975_1001    conda-forge
yaml                      0.1.7             h14c3975_1001    conda-forge
zict                      0.1.3                      py_0    conda-forge
zlib                      1.2.11            h14c3975_1004    conda-forge
zstd                      1.3.3                         1    conda-forge

conda info:

     active environment : /ebio/abt3_projects/software/dev/llmgag/.snakemake/conda/832db746
    active env location : /ebio/abt3_projects/software/dev/llmgag/.snakemake/conda/832db746
            shell level : 3
       user config file : /ebio/abt3/nyoungblut/.condarc
 populated config files : /ebio/abt3_projects/software/dev/miniconda3_dev/.condarc
                          /ebio/abt3/nyoungblut/.condarc
          conda version : 4.6.1
    conda-build version : 3.11.0
         python version : 3.6.5.final.0
       base environment : /ebio/abt3_projects/software/dev/miniconda3_dev  (writable)
           channel URLs : https://conda.anaconda.org/conda-forge/linux-64
                          https://conda.anaconda.org/conda-forge/noarch
                          https://conda.anaconda.org/bioconda/linux-64
                          https://conda.anaconda.org/bioconda/noarch
                          https://repo.anaconda.com/pkgs/main/linux-64
                          https://repo.anaconda.com/pkgs/main/noarch
                          https://repo.anaconda.com/pkgs/free/linux-64
                          https://repo.anaconda.com/pkgs/free/noarch
                          https://repo.anaconda.com/pkgs/r/linux-64
                          https://repo.anaconda.com/pkgs/r/noarch
                          https://conda.anaconda.org/leylabmpi/linux-64
                          https://conda.anaconda.org/leylabmpi/noarch
                          https://conda.anaconda.org/r/linux-64
                          https://conda.anaconda.org/r/noarch
                          https://conda.anaconda.org/qiime2/linux-64
                          https://conda.anaconda.org/qiime2/noarch
          package cache : /ebio/abt3_projects/software/dev/miniconda3_dev/pkgs
                          /ebio/abt3/nyoungblut/.conda/pkgs
       envs directories : /ebio/abt3_projects/software/dev/miniconda3_dev/envs
                          /ebio/abt3/nyoungblut/.conda/envs
               platform : linux-64
             user-agent : conda/4.6.1 requests/2.18.4 CPython/3.6.5 Linux/4.9.127 ubuntu/18.04.1 glibc/2.27
                UID:GID : 6354:350
             netrc file : None
           offline mode : False