C++ library and cmdline tools for parsing and manipulating VCF files

Overview

vcflib

A C++ library for parsing and manipulating VCF files.

Github-CI Travis-CI AnacondaBadge DL BrewBadge GuixBadge DebianBadge C++0x Gitter

overview

The Variant Call Format (VCF) is a flat-file, tab-delimited textual format that describes reference-indexed variations between individuals. VCF provides a common interchange format for the description of variation in individuals and populations of samples, and has become the de facto standard reporting format for a wide array of genomic variant detectors.

vcflib provides methods to manipulate and interpret sequence variation described by VCF. It is both:

  • an API for parsing and operating on records of genomic variation as it can be described by the VCF format
  • a collection of command-line utilities for executing complex manipulations on VCF files

vclib is both a library (with an API) and a collection of useful tools. The API provides a quick and extremely permissive method to read and write VCF files. Extensions and applications of the library provided in the included utilities (*.cpp) comprise the vast bulk of the library's utility.


Short index:


INSTALL

Bioconda

Conda installs in user land without root access

conda install -c bioconda vcflib

Homebrew

Homebrew installs on Linux and Mac OSX

brew install brewsci/bio/vcflib

Debian

For Debian and Ubuntu

apt-get install libvcflib-tools libvcflib-dev

GNU Guix

We develop against guix

guix package -i vcflib

USAGE

Users are encouraged to drive the utilities in the library in a streaming fashion, using Unix pipes to fully utilize resources on multi-core systems. Piping provides a convenient method to interface with other libraries (vcf-tools, BedTools, GATK, htslib, bio-vcf, bcftools, freebayes) which interface via VCF files, allowing the composition of an immense variety of processing functions. Examples can be found in the scripts, e.g. script.

TOOLS

filter

filter command description
vcfuniq List unique genotypes. Like GNU uniq, but for VCF records. Remove records which have the same position, ref, and alt as the previous record.
vcfuniqalleles List unique alleles For each record, remove any duplicate alternate alleles that may have resulted from merging separate VCF files.
vcffilter VCF filter the specified vcf file using the set of filters

metrics

metrics command description
vcfcheck Validate integrity and identity of the VCF by verifying that the VCF record's REF matches a given reference file.
vcfhethomratio Generates the het/hom ratio for each individual in the file
vcfhetcount Calculate the heterozygosity rate: count the number of alternate alleles in heterozygous genotypes in all records in the vcf file
vcfdistance Adds a tag to each variant record which indicates the distance to the nearest variant. (defaults to BasesToClosestVariant if no custom tag name is given.
vcfentropy Annotate VCF records with the Shannon entropy of flanking sequence. Anotates the output VCF file with, for each record, EntropyLeft, EntropyRight, EntropyCenter, which are the entropies of the sequence of the given window size to the left, right, and center of the record. Also adds EntropyRef and EntropyAlt for each alt.

phenotype

phenotype command description
permuteGPAT++ permuteGPAT++ is a method for adding empirical p-values to a GPAT++ score.

genotype

genotype command description
normalize-iHS normalizes iHS or XP-EHH scores.
hapLrt HapLRT is a likelihood ratio test for haplotype lengths. The lengths are modeled with an exponential distribution. The sign denotes if the target has longer haplotypes (1) or the background (-1).
abba-baba abba-baba calculates the tree pattern for four indviduals. This tool assumes reference is ancestral and ignores non abba-baba sites. The output is a boolian value: 1 = true , 0 = false for abba and baba. the tree argument should be specified from the most basal taxa to the most derived.

transformation

transformation command description
vcfinfo2qual Sets QUAL from info field tag keyed by [key]. The VCF file may be omitted and read from stdin. The average of the field is used if it contains multiple values.
vcfsamplediff Establish putative somatic variants using reported differences between germline and somatic samples. Tags each record where the listed sample genotypes differ with . The first sample is assumed to be germline, the second somatic. Each record is tagged with ={germline,somatic,loh} to specify the type of variant given the genotype difference between the two samples.
vcfaddinfo Adds info fields from the second file which are not present in the first vcf file.
vcfremoveaberrantgenotypes strips samples which are homozygous but have observations implying heterozygosity. Remove samples for which the reported genotype (GT) and observation counts disagree (AO, RO).
vcfglxgt Set genotypes using the maximum genotype likelihood for each sample.
dumpContigsFromHeader Dump contigs from header
vcfevenregions Generates a list of regions, e.g. chr20:10..30 using the variant density information provided in the VCF file to ensure that the regions have even numbers of variants. This can be use to reduce the variance in runtime when dividing variant detection or genotyping by genomic coordinates.
vcfcat Concatenates VCF files
vcfannotategenotypes Examine genotype correspondence. Annotate genotypes in the first file with genotypes in the second adding the genotype as another flag to each sample filed in the first file. annotation-tag is the name of the sample flag which is added to store the annotation. also adds a 'has_variant' flag for sites where the second file has a variant.
vcfafpath Display genotype paths
vcfclassify Creates a new VCF where each variant is tagged by allele class: snp, ts/tv, indel, mnp
vcfallelicprimitives If multiple allelic primitives (gaps or mismatches) are specified in a single VCF record, split the record into multiple lines, but drop all INFO fields. Does not handle genotypes (yet). MNPs are split into multiple SNPs unless the -m flag is provided. Records generated by splits have th
vcfqual2info Puts QUAL into an info field tag keyed by [key].
vcfcreatemulti If overlapping alleles are represented across multiple records, merge them into a single record. Currently only for indels.
vcfgeno2alleles modifies the genotypes field to provide the literal alleles rather than indexes
vcfsample2info Take annotations given in the per-sample fields and add the mean, median, min, or max to the site-level INFO.
vcfld Compute LD
vcfnumalt outputs a VCF stream where NUMALT has been generated for each record using sample genotypes
vcfstreamsort Sorts the input (either stdin or file) using a streaming sort algorithm. Guarantees that the positional order is correct provided out-of-order variants are no more than 100 positions in the VCF file apart.
vcfinfosummarize Take annotations given in the per-sample fields and add the mean, median, min, or max to the site-level INFO.
vcflength Add length info field
vcfkeepgeno Reduce file size by removing FORMAT fields not listed on the command line from sample specifications in the output
vcfcombine Combine VCF files positionally, combining samples when sites and alleles are identical. Any number of VCF files may be combined. The INFO field and other columns are taken from one of the files which are combined when records in multiple files match. Alleles must have identical ordering to be combined into one record. If they do not, multiple records will be emitted.
vcfprimers For each VCF record, extract the flanking sequences, and write them to stdout as FASTA records suitable for alignment.
vcfflatten Removes multi-allelic sites by picking the most common alternate. Requires allele frequency specification 'AF' and use of 'G' and 'A' to specify the fields which vary according to the Allele or Genotype. VCF file may be specified on the command line or piped as stdin.
vcf2dag Modify VCF to be able to build a directed acyclic graph (DAG)
vcfcleancomplex Removes reference-matching sequence from complex alleles and adjusts records to reflect positional change.
vcfbreakmulti If multiple alleles are specified in a single record, break the record into multiple lines, preserving allele-specific INFO fields.
vcfindex Adds an index number to the INFO field (id=position)
vcfkeepinfo To decrease file size remove INFO fields not listed on the command line
vcfgeno2haplo Convert genotype-based phased alleles within --window-size into haplotype alleles. Will break haplotype construction when encountering non-phased genotypes on input.
vcfintersect VCF set analysis
vcfannotate Intersect the records in the VCF file with targets provided in a BED file. Intersections are done on the reference sequences in the VCF file. If no VCF filename is specified on the command line (last argument) the VCF read from stdin.
smoother smoothes is a method for window smoothing many of the GPAT++ formats.
vcf2fasta Generates sample_seq:N.fa for each sample, reference sequence, and chromosomal copy N in [0,1... ploidy]. Each sequence in the fasta file is named using the same pattern used for the file name, allowing them to be combined.
vcfsamplenames List sample names
vcfleftalign Left-align indels and complex variants in the input using a pairwise ref/alt alignment followed by a heuristic, iterative left realignment process that shifts indel representations to their absolute leftmost (5') extent.
vcfglbound Adjust GLs so that the maximum GL is 0 by dividing all GLs for each sample by the max.
vcfcommonsamples Generates each record in the first file, removing samples not present in the second
vcfecho Echo VCF to stdout (simple demo)
vcfkeepsamples outputs each record in the vcf file, removing samples not listed on the command line
vcf2tsv Converts VCF to per-allelle or per-genotype tab-delimited format, using null string to replace empty values in the table. Specifying -g will output one line per sample with genotype information. When there is more than one alt allele there will be multiple rows, one for each allele and, the info will match the 'A' index
vcfoverlay Overlay records in the input vcf files with order as precedence.
vcfgenosamplenames Get samplenames
vcfremovesamples outputs each record in the vcf file, removing samples listed on the command line
vcfremap For each alternate allele, attempt to realign against the reference with lowered gap open penalty. If realignment is possible, adjust the cigar and reference/alternate alleles. Observe how different alignment parameters, including context and entropy-dependent ones, influence variant classification and interpretation.
vcffixup Generates a VCF stream where AC and NS have been generated for each record using sample genotypes

statistics

statistics command description
vcfgenosummarize Adds summary statistics to each record summarizing qualities reported in called genotypes. Uses: RO (reference observation count), QR (quality sum reference observations) AO (alternate observation count), QA (quality sum alternate observations)
vcfcountalleles Count alleles
meltEHH
genotypeSummary Generates a table of genotype counts. Summarizes genotype counts for bi-allelic SNVs and indel
vcfrandomsample Randomly sample sites from an input VCF file, which may be provided as stdin. Scale the sampling probability by the field specified in KEY. This may be used to provide uniform sampling across allele frequencies, for instance.
pVst pVst calculates vst, a measure of CNV stratification.
vcfrandom Generate a random VCF file
segmentFst segmentFst creates genomic segments (bed file) for regions with high wcFst
sequenceDiversity The sequenceDiversity program calculates two popular metrics of haplotype diversity: pi and extended haplotype homozygoisty (eHH). Pi is calculated using the Nei and Li 1979 formulation. eHH a convenient way to think about haplotype diversity. When eHH = 0 all haplotypes in the window are unique and when eHH = 1 all haplotypes in the window are identical.
segmentIhs Creates genomic segments (bed file) for regions with high wcFst
vcfgenotypes Report the genotypes for each sample, for each variant in the VCF. Convert the numerical represenation of genotypes provided by the GT field to a human-readable genotype format.
vcfaltcount count the number of alternate alleles in all records in the vcf file
plotHaps plotHaps provides the formatted output that can be used with 'bin/plotHaplotypes.R'.
vcfsitesummarize Summarize by site
vcfgenotypecompare adds statistics to the INFO field of the vcf file describing the amount of discrepancy between the genotypes (GT) in the vcf file and the genotypes reported in the . use this after vcfannotategenotypes to get correspondence statistics for two vcfs.
vcfstats Prints statistics about variants in the input VCF file.
wcFst wcFst is Weir & Cockerham's Fst for two populations. Negative values are VALID, they are sites which can be treated as zero Fst. For more information see Evolution, Vol. 38 N. 6 Nov 1984. Specifically wcFst uses equations 1,2,3,4.
permuteSmooth permuteSmooth is a method for adding empirical p-values smoothed wcFst scores.
bFst bFst is a Bayesian approach to Fst. Importantly bFst accounts for genotype uncertainty in the model using genotype likelihoods. For a more detailed description see: `A Bayesian approach to inferring population structure from dominant markers' by Holsinger et al. Molecular Ecology Vol 11, issue 7 2002. The likelihood function has been modified to use genotype likelihoods provided by variant callers. There are five free parameters estimated in the model: each subpopulation's allele frequency and Fis (fixation index, within each subpopulation), a free parameter for the total population's allele frequency, and Fst.
vcfroc Generates a pseudo-ROC curve using sensitivity and specificity estimated against a putative truth set. Thresholding is provided by successive QUAL cutoffs.
vcfparsealts Alternate allele parsing method. This method uses pairwise alignment of REF and ALTs to determine component allelic primitives for each alternate allele.
pFst pFst is a probabilistic approach for detecting differences in allele frequencies between two populations.
iHS iHS calculates the integrated haplotype score which measures the relative decay of extended haplotype homozygosity (EHH) for the reference and alternative alleles at a site (see: voight et al. 2006, Spiech & Hernandez 2014).
popStats General population genetic statistics for each SNP

See also vcflib.md.

scripts

The vcflib source repository contains a number of additional scripts. Click on the link to see the source code.

script description
vcfclearinfo clear INFO field
vcfqualfilter quality filter
vcfnulldotslashdot rewrite null genotypes to ./.
vcfprintaltdiscrepancy.r show ALT discrepancies in a table
vcfremovenonATGC remove non-nucleotides in REF or ALT
plotSmoothed.R smooth plot of wcFst, pFst or abba-baba
vcf_strip_extra_headers strip headers
plotHapLrt.R plot results of pFst
vcfbiallelic remove anything that is not biallelic
vcfsort sort VCF using shell script
vcfnosnps remove SNPs
vcfmultiwayscripts more multiway comparisons
vcfgtcompare.sh annotates records in the first file with genotypes and sites from the second
plotPfst.R plot pFst
vcfregionreduce_and_cut reduce, gzip, and tabix
plotBfst.R plot results of pFst
vcfnobiallelicsnps remove biallelic SNPs
vcfindels show INDELS
vcfmultiway multiway comparison
vcfregionreduce reduce VCFs using a BED File, gzip them up and create tabix index
vcfprintaltdiscrepancy.sh runner
vcfclearid clear ID field
vcfcomplex remove all SNPs but keep SVs
vcffirstheader show first header
plotXPEHH.R plot XPEHH
vcfregionreduce_pipe reduce, gzip and tabix in a pipe
vcfplotaltdiscrepancy.sh plot ALT discrepancy runner
vcfplottstv.sh runner
vcfnoindels remove INDELs
bgziptabix runs bgzip on the input and tabix indexes the result
plotHaplotypes.R plot results
vcfplotsitediscrepancy.r plot site discrepancy
vcfindelproximity show SNPs around an INDEL
bed2region convert VCF CHROM column in VCF file to region
vcfplotaltdiscrepancy.r plot ALT discrepancies
plot_roc.r plot ROC
vcfmultiallelic remove anything that is not multiallelic
vcfsnps show SNPs
vcfvarstats use fastahack to get stats
vcfregionreduce_uncompressed reduce, gzip and tabix
plotWCfst.R plot wcFst
vcf2bed.py transform VCF to BED file
vcfjoincalls overlay files using QUAL and GT from a second VCF
vcf2sqlite.py push VCF file into SQLite3 database using dbname

Development

build from source

VCFLIB uses the cmake build system, after a recursive checkout of the sources make the files in the ./build directory with:

git clone --recursive https://github.com/vcflib/vcflib.git
cd vcflib
mkdir -p build && cd build
cmake ..
cmake --build .
cmake --install .

and to run the tests

ctest --verbose

Executables are built into the ./build directory in the repository.

Build dependencies can be viewed in the Travis-CI and github-CI scripts (see badges above), as well as guix.scm used by us to create the build environment (for instructions see the header of guix.scm). Essentially:

  • C++ compiler
  • htslib
  • tabixpp

For include files add

  • libhts-dev
  • libtabixpp-dev
  • libtabixpp0

And for some of the VCF executables

  • python
  • perl

Using a different htslib

Check out htslib in tabixpp (recursively) and

cmake -DHTSLIB_LOCAL:STRING=./tabixpp/htslib/ ..
cmake --build .

link library

The standard build creates build/vcflib.a. Take a hint from the cmake file that builds all the vcflib tools.

source code

See vcfecho.cpp for basic usage. Variant.h and Variant.cpp describe methods available in the API. vcflib is incorporated into several projects, such as freebayes, which may provide a point of reference for prospective developers. Note vcflib contains submodules (git repositories) comprising some dependencies. A full Guix development environment we use is defined here.

adding tests

vcflib uses different test systems. The most important one is the doctest because it doubles as documentation. For an example see vcf2tsv.md which can be run from the command line with

cd test
python3 -m doctest -o NORMALIZE_WHITESPACE -o REPORT_UDIFF pytest/vcf2tsv.md
``

# Contributing

To contribute code to vcflib send a github pull request. We may ask
you to add a working test case as described in 'adding tests'.

# LICENSE

This software is distributed under the free software [MIT
LICENSE](./LICENSE).
Comments
  • Package 'htslib', required by 'virtual:world', not found

    Package 'htslib', required by 'virtual:world', not found

    Describe the bug

    I'm working on a Linux based HPC working with private data hence the environment is restrictive (no internet connection); therefore packages must be built from source.

    I get a missing htslib error when building from source which I thought was odd - isn't htslib bundled with the package?

    To Reproduce

    On laptop:

    Download zip folder from the repo's Releases page

    Note: right clicking the "Source code (zip)" link and "Copy Link Location" to clipboard provides the link below, but CLI command provided for simplicity.

    wget https://codeload.github.com/vcflib/vcflib/zip/v1.0.2
    
    

    Transfer zip to secure Linux environment, then:

    $ unzip https://github.com/vcflib/vcflib/archive/v1.0.2.zip
    $ cd vcflib-1.0.2
    $ mkdir -p build && cd build
    $ cmake ..
    -- The C compiler identification is GNU 8.3.1
    -- The CXX compiler identification is GNU 8.3.1
    -- Check for working C compiler: /usr/bin/cc
    -- Check for working C compiler: /usr/bin/cc -- works
    -- Detecting C compiler ABI info
    -- Detecting C compiler ABI info - done
    -- Detecting C compile features
    -- Detecting C compile features - done
    -- Check for working CXX compiler: /usr/bin/c++
    -- Check for working CXX compiler: /usr/bin/c++ -- works
    -- Detecting CXX compiler ABI info
    -- Detecting CXX compiler ABI info - done
    -- Detecting CXX compile features
    -- Detecting CXX compile features - done
    -- Found PkgConfig: /usr/bin/pkg-config (found version "1.4.2")
    -- Found BZip2: /usr/lib64/libbz2.so (found version "1.0.6")
    -- Looking for BZ2_bzCompressInit
    -- Looking for BZ2_bzCompressInit - found
    -- Looking for lzma_auto_decoder in /usr/lib64/liblzma.so
    -- Looking for lzma_auto_decoder in /usr/lib64/liblzma.so - found
    -- Looking for lzma_easy_encoder in /usr/lib64/liblzma.so
    -- Looking for lzma_easy_encoder in /usr/lib64/liblzma.so - found
    -- Looking for lzma_lzma_preset in /usr/lib64/liblzma.so
    -- Looking for lzma_lzma_preset in /usr/lib64/liblzma.so - found
    -- Found LibLZMA: /usr/include (found version "5.2.4")
    -- Found ZLIB: /usr/lib64/libz.so (found version "1.2.11")
    --
    
    -- Looking for pthread.h
    -- Looking for pthread.h - found
    -- Looking for pthread_create
    -- Looking for pthread_create - not found
    -- Looking for pthread_create in pthreads
    -- Looking for pthread_create in pthreads - not found
    -- Looking for pthread_create in pthread
    -- Looking for pthread_create in pthread - found
    -- Found Threads: TRUE
    -- Checking for module 'htslib'
    --   Package 'htslib', required by 'virtual:world', not found
    CMake Error at /usr/share/cmake/Modules/FindPkgConfig.cmake:418 (message):
      A required package was not found
    Call Stack (most recent call first):
      /usr/share/cmake/Modules/FindPkgConfig.cmake:585 (_pkg_check_modules_internal)
      CMakeLists.txt:33 (pkg_check_modules)
    
    
    -- Configuring incomplete, errors occurred!
    See also "/project/M-mtgraovac182840/tools/vcflib-1.0.2/build/CMakeFiles/CMakeOutput.log".
    See also "/project/M-mtgraovac182840/tools/vcflib-1.0.2/build/CMakeFiles/CMakeError.log".
    

    Desired outcome

    I would like the package to build using the bundled htslib, or, alternatively, to provide PATHs for an external installation of htslib I have on the HPC.

    Thank you

    compiling build 
    opened by moldach 34
  • Your vcf2tsv has a really annoying bug!!!

    Your vcf2tsv has a really annoying bug!!!

    You claim that your vcf2tsv outputs one record per allele rather than one output per SNP but you are actually messing up the output by not separating the per-allele records by a new-line character!!!! This is a really elementary mistake and really annoying because the output records are messed up and, contrary to expectations, they equal the number of input records! Could you please fix this bug and test your vcf2tsv code?

    Moreover, you are randomly changing the order of the input INFO fields in the output tsv file whereas it would be preferable to keep it the same.

    Here is an example concatenated output:

    1 17380465 rs138979875 G A 0 . . . . . . RCV00013 2258.2 2 Hereditary_cancer-predisposing_syndrome MedGen:SNOMED_CT C0027672:699346009 NC_000001.10:g.1 7380465G>T 1 single 0 . . . . . . SDHB:6390 . . . . . . . . . . . . . . . . . . . . 138979875 17380465 . . 1 . 0 . . . . SNV . 0x050060000a05040002100100 1 . 1341 17380465 rs138979875 G T 0 . . . . . . RCV000132258.2 2 Hereditary_cancer-predisposing_s yndrome MedGen:SNOMED_CT C0027672:699346009 NC_000001.10:g.17380465G>T 1 single 0 . . . . . . SDHB:6390 . . . . . . . . . . . . . . . . . . . . 138979875 17380465 . . 1 . 0 . . . . SNV . 0x050060000a050400021001 00 1 . 134

    opened by deniseduma 27
  • Core dump when combining VCFs

    Core dump when combining VCFs

    Hi there I thought i'd give vcfcombine another go, and on testing on small test VCfs it worked perfectly. However when I tried this (on a clean git clone --recursive), combining 2600 VCFs, I got this

    vcflib/bin/vcfcombine results/*vcf > outvcf terminate called after throwing an instance of 'std::out_of_range' what(): basic_string::substr Aborted (core dumped) bash-4.1$

    opened by iqbal-lab 21
  • Where did vcffirstheader, vcfsort etc scripts disappear to?

    Where did vcffirstheader, vcfsort etc scripts disappear to?

    It seems a bunch of helper scripts have disappeared from vcflib since it moved from ekg to vcflib.

    eg. vcffirstheader, vcfsort etc

    Is this deliberate or an accident?

    question not-resolved 
    opened by tseemann 20
  • Downloading and installing problem

    Downloading and installing problem

    Hello I am trying to download vcflib onto server. But i couldn't download it. I followed the instructions given, all I am able to get is index.html and 2 png files. Please help me to download it. Thanks in advance.

    opened by Swanthana 19
  • vcffilter -g fails ?

    vcffilter -g fails ?

    when i run
    '''shell zcat saw.vcf.gz | vcffilter -f " DP > 3 & MQ > 30" -g " ! ( GT = 1/1 ) " ''' the format column is replaced by a '.'

    not-resolved stale 
    opened by kongdeju 16
  • VCF filter bug? adding previous lines meta information.

    VCF filter bug? adding previous lines meta information.

    Hi,

    I have been using vcffilter to perform some filtering on my dataset.

    I have made the file smaller to give an example of the bug.

    If I run

    vcffilter -g "DP > 2" <file below>
    

    On this file.

    ##fileformat=VCFv4.1
    ##ALT=<ID=NON_REF,Description="Represents any possible alternative allele at this location">
    ##FILTER=<ID=LowQual,Description="Low quality">
    ##FORMAT=<ID=AD,Number=.,Type=Integer,Description="Allelic depths for the ref and alt alleles in the order listed">
    ##FORMAT=<ID=DP,Number=1,Type=Integer,Description="Approximate read depth (reads with MQ=255 or with bad mates are filtered)">
    ##FORMAT=<ID=GQ,Number=1,Type=Integer,Description="Genotype Quality">
    ##FORMAT=<ID=GT,Number=1,Type=String,Description="Genotype">
    ##FORMAT=<ID=MIN_DP,Number=1,Type=Integer,Description="Minimum DP observed within the GVCF block">
    ##FORMAT=<ID=PL,Number=G,Type=Integer,Description="Normalized, Phred-scaled likelihoods for genotypes as defined in the VCF specification">
    ##FORMAT=<ID=SB,Number=4,Type=Integer,Description="Per-sample component statistics which comprise the Fisher's Exact Test to detect strand bias.">
    ##GATKCommandLine=<ID=GenotypeGVCFs,Version=3.3-0-g37228af,Date="Mon Mar 23 17:02:25 NZDT 2015",Epoch=1427083345263,CommandLineOptions="analysis_type=GenotypeGVCFs input_file=[] showFullBamList=false read_buffer_size=null phone_home=AWS gatk_key=null tag=NA read_filter=[] intervals=null excludeIntervals=null interval_set_rule=UNION interval_merging=ALL interval_padding=0 reference_sequence=/Users/james/bioinformatics/ancient_dna_pipeline/ref/contamination.fa nonDeterministicRandomSeed=false disableDithering=false maxRuntime=-1 maxRuntimeUnits=MINUTES downsampling_type=BY_SAMPLE downsample_to_fraction=null downsample_to_coverage=1000 baq=OFF baqGapOpenPenalty=40.0 refactor_NDN_cigar_string=false fix_misencoded_quality_scores=false allow_potentially_misencoded_quality_scores=false useOriginalQualities=false defaultBaseQualities=-1 performanceLog=null BQSR=null quantize_quals=0 disable_indel_quals=false emit_original_quals=false preserve_qscores_less_than=6 globalQScorePrior=-1.0 validation_strictness=SILENT remove_program_records=false keep_program_records=false sample_rename_mapping_file=null unsafe=null disable_auto_index_creation_and_locking_when_reading_rods=false no_cmdline_in_header=false sites_only=false never_trim_vcf_format_field=false bcf=false bam_compression=null simplifyBAM=false disable_bam_indexing=false generate_md5=false num_threads=1 num_cpu_threads_per_data_thread=1 num_io_threads=0 monitorThreadEfficiency=false num_bam_file_handles=null read_group_black_list=null pedigree=[] pedigreeString=[] pedigreeValidationType=STRICT allow_intervals_with_unindexed_bam=false generateShadowBCF=false variant_index_type=DYNAMIC_SEEK variant_index_parameter=-1 logging_level=INFO log_to_file=null help=false version=false variant=[(RodBindingCollection [(RodBinding name=variant source=temp_results/MS10062.yes_collapse.final_contaminant_mapped.gvcf)]), (RodBindingCollection [(RodBinding name=variant2 source=temp_results/MS10066.yes_collapse.final_contaminant_mapped.gvcf)]), (RodBindingCollection [(RodBinding name=variant3 source=temp_results/MS10068.yes_collapse.final_contaminant_mapped.gvcf)]), (RodBindingCollection [(RodBinding name=variant4 source=temp_results/MS10069.yes_collapse.final_contaminant_mapped.gvcf)]), (RodBindingCollection [(RodBinding name=variant5 source=temp_results/MS10070.yes_collapse.final_contaminant_mapped.gvcf)]), (RodBindingCollection [(RodBinding name=variant6 source=temp_results/MS10129.yes_collapse.final_contaminant_mapped.gvcf)]), (RodBindingCollection [(RodBinding name=variant7 source=temp_results/MS10130.yes_collapse.final_contaminant_mapped.gvcf)]), (RodBindingCollection [(RodBinding name=variant8 source=temp_results/MS10131.yes_collapse.final_contaminant_mapped.gvcf)]), (RodBindingCollection [(RodBinding name=variant9 source=temp_results/MS10132.yes_collapse.final_contaminant_mapped.gvcf)]), (RodBindingCollection [(RodBinding name=variant10 source=temp_results/MS10133.yes_collapse.final_contaminant_mapped.gvcf)]), (RodBindingCollection [(RodBinding name=variant11 source=temp_results/MS10134.yes_collapse.final_contaminant_mapped.gvcf)]), (RodBindingCollection [(RodBinding name=variant12 source=temp_results/MS10135.yes_collapse.final_contaminant_mapped.gvcf)]), (RodBindingCollection [(RodBinding name=variant13 source=temp_results/MS10136.yes_collapse.final_contaminant_mapped.gvcf)]), (RodBindingCollection [(RodBinding name=variant14 source=temp_results/MS10137.yes_collapse.final_contaminant_mapped.gvcf)])] out=org.broadinstitute.gatk.engine.io.stubs.VariantContextWriterStub includeNonVariantSites=false annotateNDA=false heterozygosity=0.001 indel_heterozygosity=1.25E-4 standard_min_confidence_threshold_for_calling=10.0 standard_min_confidence_threshold_for_emitting=30.0 max_alternate_alleles=6 input_prior=[] sample_ploidy=2 annotation=[InbreedingCoeff, FisherStrand, QualByDepth, ChromosomeCounts, GenotypeSummaries, StrandOddsRatio] dbsnp=(RodBinding name= source=UNBOUND) filter_reads_with_N_cigar=false filter_mismatching_base_and_quals=false filter_bases_not_stored=false">
    ##GATKCommandLine=<ID=HaplotypeCaller,Version=3.3-0-g37228af,Date="Mon Mar 23 16:56:57 NZDT 2015",Epoch=1427083017252,CommandLineOptions="analysis_type=HaplotypeCaller input_file=[results/bams/MS10066.yes_collapse.final_contaminant_mapped.bam, results/bams/MS10066.no_collapse.final_contaminant_mapped.bam] showFullBamList=false read_buffer_size=null phone_home=AWS gatk_key=null tag=NA read_filter=[] intervals=null excludeIntervals=null interval_set_rule=UNION interval_merging=ALL interval_padding=0 reference_sequence=/Users/james/bioinformatics/ancient_dna_pipeline/ref/contamination.fa nonDeterministicRandomSeed=false disableDithering=false maxRuntime=-1 maxRuntimeUnits=MINUTES downsampling_type=BY_SAMPLE downsample_to_fraction=null downsample_to_coverage=250 baq=OFF baqGapOpenPenalty=40.0 refactor_NDN_cigar_string=false fix_misencoded_quality_scores=false allow_potentially_misencoded_quality_scores=false useOriginalQualities=false defaultBaseQualities=-1 performanceLog=null BQSR=null quantize_quals=0 disable_indel_quals=false emit_original_quals=false preserve_qscores_less_than=6 globalQScorePrior=-1.0 validation_strictness=SILENT remove_program_records=false keep_program_records=false sample_rename_mapping_file=null unsafe=null disable_auto_index_creation_and_locking_when_reading_rods=false no_cmdline_in_header=false sites_only=false never_trim_vcf_format_field=false bcf=false bam_compression=null simplifyBAM=false disable_bam_indexing=false generate_md5=false num_threads=1 num_cpu_threads_per_data_thread=1 num_io_threads=0 monitorThreadEfficiency=false num_bam_file_handles=null read_group_black_list=null pedigree=[] pedigreeString=[] pedigreeValidationType=STRICT allow_intervals_with_unindexed_bam=false generateShadowBCF=false variant_index_type=LINEAR variant_index_parameter=128000 logging_level=INFO log_to_file=null help=false version=false likelihoodCalculationEngine=PairHMM heterogeneousKmerSizeResolution=COMBO_MIN graphOutput=null bamOutput=null bamWriterType=CALLED_HAPLOTYPES disableOptimizations=false dbsnp=(RodBinding name= source=UNBOUND) dontTrimActiveRegions=false maxDiscARExtension=25 maxGGAARExtension=300 paddingAroundIndels=150 paddingAroundSNPs=20 comp=[] annotation=[ClippingRankSumTest, DepthPerSampleHC, StrandBiasBySample] excludeAnnotation=[SpanningDeletions, TandemRepeatAnnotator, ChromosomeCounts, FisherStrand, StrandOddsRatio, QualByDepth] debug=false useFilteredReadsForAnnotations=false emitRefConfidence=GVCF annotateNDA=false heterozygosity=0.001 indel_heterozygosity=1.25E-4 standard_min_confidence_threshold_for_calling=-0.0 standard_min_confidence_threshold_for_emitting=-0.0 max_alternate_alleles=6 input_prior=[] sample_ploidy=1 genotyping_mode=DISCOVERY alleles=(RodBinding name= source=UNBOUND) contamination_fraction_to_filter=0.0 contamination_fraction_per_sample_file=null p_nonref_model=null exactcallslog=null output_mode=EMIT_ALL_SITES allSitePLs=true sample_name=null kmerSize=[10, 25] dontIncreaseKmerSizesForCycles=false allowNonUniqueKmersInRef=false numPruningSamples=1 recoverDanglingHeads=false doNotRecoverDanglingBranches=false minDanglingBranchLength=4 consensus=false GVCFGQBands=[1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 44, 45, 46, 47, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 70, 80, 90, 99] indelSizeToEliminateInRefModel=10 min_base_quality_score=10 minPruning=2 gcpHMM=10 includeUmappedReads=false useAllelesTrigger=false phredScaledGlobalReadMismappingRate=45 maxNumHaplotypesInPopulation=128 mergeVariantsViaLD=false doNotRunPhysicalPhasing=true pair_hmm_implementation=VECTOR_LOGLESS_CACHING keepRG=null justDetermineActiveRegions=false dontGenotype=false errorCorrectKmers=false debugGraphTransformations=false dontUseSoftClippedBases=false captureAssemblyFailureBAM=false allowCyclesInKmerGraphToGeneratePaths=false noFpga=false errorCorrectReads=false kmerLengthForReadErrorCorrection=25 minObservationsForKmerToBeSolid=20 pcr_indel_model=CONSERVATIVE maxReadsInRegionPerSample=1000 minReadsPerAlignmentStart=5 activityProfileOut=null activeRegionOut=null activeRegionIn=null activeRegionExtension=null forceActive=false activeRegionMaxSize=null bandPassSigma=null maxProbPropagationDistance=50 activeProbabilityThreshold=0.002 min_mapping_quality_score=20 filter_reads_with_N_cigar=false filter_mismatching_base_and_quals=false filter_bases_not_stored=false">
    ##GVCFBlock=minGQ=0(inclusive),maxGQ=1(exclusive)
    ##INFO=<ID=AC,Number=A,Type=Integer,Description="Allele count in genotypes, for each ALT allele, in the same order as listed">
    ##INFO=<ID=AF,Number=A,Type=Float,Description="Allele Frequency, for each ALT allele, in the same order as listed">
    ##INFO=<ID=AN,Number=1,Type=Integer,Description="Total number of alleles in called genotypes">
    ##INFO=<ID=BaseQRankSum,Number=1,Type=Float,Description="Z-score from Wilcoxon rank sum test of Alt Vs. Ref base qualities">
    ##INFO=<ID=CCC,Number=1,Type=Integer,Description="Number of called chromosomes">
    ##INFO=<ID=ClippingRankSum,Number=1,Type=Float,Description="Z-score From Wilcoxon rank sum test of Alt vs. Ref number of hard clipped bases">
    ##INFO=<ID=DP,Number=1,Type=Integer,Description="Approximate read depth; some reads may have been filtered">
    ##INFO=<ID=DS,Number=0,Type=Flag,Description="Were any of the samples downsampled?">
    ##INFO=<ID=END,Number=1,Type=Integer,Description="Stop position of the interval">
    ##INFO=<ID=FS,Number=1,Type=Float,Description="Phred-scaled p-value using Fisher's exact test to detect strand bias">
    ##INFO=<ID=GQ_MEAN,Number=1,Type=Float,Description="Mean of all GQ values">
    ##INFO=<ID=GQ_STDDEV,Number=1,Type=Float,Description="Standard deviation of all GQ values">
    ##INFO=<ID=HWP,Number=1,Type=Float,Description="P value from test of Hardy Weinberg Equilibrium">
    ##INFO=<ID=HaplotypeScore,Number=1,Type=Float,Description="Consistency of the site with at most two segregating haplotypes">
    ##INFO=<ID=InbreedingCoeff,Number=1,Type=Float,Description="Inbreeding coefficient as estimated from the genotype likelihoods per-sample when compared against the Hardy-Weinberg expectation">
    ##INFO=<ID=MLEAC,Number=A,Type=Integer,Description="Maximum likelihood expectation (MLE) for the allele counts (not necessarily the same as the AC), for each ALT allele, in the same order as listed">
    ##INFO=<ID=MLEAF,Number=A,Type=Float,Description="Maximum likelihood expectation (MLE) for the allele frequency (not necessarily the same as the AF), for each ALT allele, in the same order as listed">
    ##INFO=<ID=MQ,Number=1,Type=Float,Description="RMS Mapping Quality">
    ##INFO=<ID=MQ0,Number=1,Type=Integer,Description="Total Mapping Quality Zero Reads">
    ##INFO=<ID=MQRankSum,Number=1,Type=Float,Description="Z-score From Wilcoxon rank sum test of Alt vs. Ref read mapping qualities">
    ##INFO=<ID=NCC,Number=1,Type=Integer,Description="Number of no-called samples">
    ##INFO=<ID=QD,Number=1,Type=Float,Description="Variant Confidence/Quality by Depth">
    ##INFO=<ID=ReadPosRankSum,Number=1,Type=Float,Description="Z-score from Wilcoxon rank sum test of Alt vs. Ref read position bias">
    ##INFO=<ID=SOR,Number=1,Type=Float,Description="Symmetric Odds Ratio of 2x2 contingency table to detect strand bias">
    ##contig=<ID=gi|5834843|ref|NC_001323.1|,length=16775>
    ##contig=<ID=gi|60101824|ref|NC_006853.1|,length=16338>
    ##contig=<ID=gi|17737322|ref|NC_002008.4|,length=16727>
    ##contig=<ID=gi|251831106|ref|NC_012920.1|,length=16569>
    ##contig=<ID=gi|223976078|ref|NC_012095.1|,length=16770>
    ##reference=file:///Users/james/bioinformatics/ancient_dna_pipeline/ref/contamination.fa
    #CHROM  POS ID  REF ALT QUAL    FILTER  INFO    FORMAT  MS10062 MS10066 MS10068 MS10069 MS10070 MS10129 MS10130 MS10131 MS10132 MS10133 MS10134 MS10135 MS10136 MS10137
    gi|17737322|ref|NC_002008.4|    15814   .   C   T   36001.14    PASS    AC=14;AF=1.00;AN=14;DP=917;FS=0.000;GQ_MEAN=2573.29;GQ_STDDEV=3313.62;MLEAC=14;MLEAF=1.00;MQ=49.38;MQ0=0;NCC=0;QD=33.10;SOR=0.729   GT:AD:DP:GQ:PL  1/1:0,62:62:99:2375,0   1/1:0,63:63:99:2385,0   1/1:0,16:16:99:552,0    1/1:0,262:262:99:10564,0    1/1:0,185:185:99:7514,0 1/1:0,7:7:99:227,0  1/1:0,7:7:99:231,0  1/1:0,34:34:99:1133,0   1/1:0,12:12:99:464,0    1/1:0,46:46:99:1836,0   1/1:0,8:8:99:226,0  1/1:0,27:27:99:1070,0   1/1:0,174:174:99:7029,0 1/1:0,11:11:99:420,0
    gi|17737322|ref|NC_002008.4|    16168   .   A   G   302.14  PASS    AC=1;AF=0.100;AN=10;BaseQRankSum=-2.515e+00;ClippingRankSum=0.451;DP=240;FS=18.493;GQ_MEAN=132.10;GQ_STDDEV=78.12;MLEAC=1;MLEAF=0.100;MQ=47.53;MQ0=0;MQRankSum=-6.150e-01;NCC=4;QD=5.40;ReadPosRankSum=0.516;SOR=3.247  GT:AD:DP:GQ:PL  0/0:22,0:22:90:0,90 0/0:13,0:13:90:0,90 ./.:0,0 1/1:28,28:56:99:337,0   0/0:53,0:53:99:0,135    ./.:0,0 ./.:0,0 0/0:15,0:15:99:0,135    0/0:7,0:7:99:0,135  0/0:17,0:17:99:0,129    ./.:0,0 0/0:5,0:5:45:0,45   0/0:47,0:47:99:0,135    0/0:4,0:4:90:0,90
    

    The output of the variant lines are as follows.

    gi|17737322|ref|NC_002008.4|    15814   .       C       T       36001.1 PASS    AC=14;AF=1.00;AN=14;DP=917;FS=0.000;GQ_MEAN=2573.29;GQ_STDDEV=3313.62;MLEAC=14;MLEAF=1.00;MQ=49
    .38;MQ0=0;NCC=0;QD=33.10;SOR=0.729       GT:AD:DP:GQ:PL  1/1:0,62:62:99:2375,0   1/1:0,63:63:99:2385,0   1/1:0,16:16:99:552,0    1/1:0,262:262:99:10564,0        1/1:0,185:185:
    99:7514,0 1/1:0,7:7:99:227,0      1/1:0,7:7:99:231,0      1/1:0,34:34:99:1133,0   1/1:0,12:12:99:464,0    1/1:0,46:46:99:1836,0   1/1:0,8:8:99:226,0      1/1:0,27:27:99:1070,0
       1/1:0,174:174:99:7029,0 1/1:0,11:11:99:420,0
    gi|17737322|ref|NC_002008.4|    16168   .       A       G       302.14  PASS    AC=1;AF=0.100;AN=10;BaseQRankSum=-2.515e+00;ClippingRankSum=0.451;DP=240;FS=18.493;GQ_MEAN=132.
    10;GQ_STDDEV=78.12;MLEAC=1;MLEAF=0.100;MQ=47.53;MQ0=0;MQRankSum=-6.150e-01;NCC=4;QD=5.40;ReadPosRankSum=0.516;SOR=3.247  GT:AD:DP:GQ:PL  0/0:22,0:22:90:0,90     0/0:13,0:13:90
    :0,90     1/1:0,16:16:99:552,0    1/1:28,28:56:99:337,0   0/0:53,0:53:99:0,135    1/1:0,7:7:99:227,0      1/1:0,7:7:99:231,0      0/0:15,0:15:99:0,135    0/0:7,0:7:99:0,135
          0/0:17,0:17:99:0,129    1/1:0,8:8:99:226,0      0/0:5,0:5:45:0,45       0/0:47,0:47:99:0,135    0/0:4,0:4:90:0,90
    

    As you can see the meta-information for the missing genotypes are placed onto the SNP at position 16168, and it is incorrectly given a call for some of the samples, such as the 3rd sample.

    Thanks James.

    stale 
    opened by theboocock 14
  • Merging 3000 VCFs using vcflib

    Merging 3000 VCFs using vcflib

    Hi Erik,

    Thanks for developing FreeBayes and vcflib and pointing us to vt. We have been using those tools in our pipelines and they worked really good.

    I have a quick question about vcflib and was wondering if you could help me. We have 3,000 whole exome VCFs (each with multiple samples) that we have to make them merge into one VCF. I really liked vcflib and how it works. Would you think if vcflib could handle this amount of VCF merging, and if yes, how long time and how much memory would you think it might need? We tried GATK CombineVariants and it estimates over 15 weeks…. Any advice will be highly appreciated!

    Thank you very much in advance.

    Best, Riyue Bao Senior Bioinformatician Center for Research Informatics | Biological Sciences Division The University of Chicago

    discussion stale 
    opened by riyuebao 12
  • Build fails on OS X yosemite

    Build fails on OS X yosemite

    Hullo

    I am trying to build on OS X and I get the following error

    darwin: make cd tabixpp && /Applications/Xcode.app/Contents/Developer/usr/bin/make ar -cru libtabix.a ar: no archive members specified usage: ar -d [-TLsv] archive file ... ar -m [-TLsv] archive file ... ar -m [-abiTLsv] position archive file ... ar -p [-TLsv] archive [file ...] ar -q [-cTLsv] archive file ... ar -r [-cuTLsv] archive file ... ar -r [-abciuTLsv] position archive file ... ar -t [-TLsv] archive [file ...] ar -x [-ouTLsv] archive [file ...] make[2]: *** [libtabix.a] Error 1 make[1]: *** [all-recur] Error 1 make: *** [tabixpp/tabix.o] Error 2

    It appears to be because in the tabixpp Makefile the $(LOBJS) value is not set to anything when it tries to complete the rule libtabix.a.

    Manually setting $(LOBJS) to tabix.o and removing the -s switch in the smithwaterman Makefile seems to let the make complete and produce programs that at least display their usage, but I am wondering if there are other object that should be included in LOBJS? I am also wondering why it builds ok on Linux without manually setting $LOBJS but not my OS X install.

    Thanks

    Pieta

    opened by PietaSchofield 11
  • vcf2fasta error

    vcf2fasta error

    Hi I am trying to use the vcf2fasta tool . I am getting an error : variant scaffold79:110 is not phased, cannot convert to fasta

    Can you help me out to resolve this.

    Best

    opened by vikaskumar1019 11
  • Time for a new release

    Time for a new release

    Debian would appreciate a new release, as we use vcflib v1.0.1 for the vg package, and currently they are 34 commits ahead https://github.com/vcflib/vcflib/compare/v1.0.1...a31f0578569a2dfb8ecc99d9cbde03cd5a0b2fd4

    Thanks!

    packaging 
    opened by mr-c 10
  • Updating the latest version on Conda

    Updating the latest version on Conda

    Dear there, it's still v1.0.3 when I install vcflib with Conda (nearly half a year since the last upload :) ), and the vcfwave is not included. Wondering if it's possible to update with the latest version? (I tried to install from the source but just tired to fix those errors encountered.) Thanks a lot in advance!

    opened by heroalone 1
  • Including vcflib headers now leaks a FORMAT macro, which conflicts with user constants named FORMAT regardless of scope or namespace

    Including vcflib headers now leaks a FORMAT macro, which conflicts with user constants named FORMAT regardless of scope or namespace

    After including Variant.h, I get a macro named FORMAT defined as '%', from WFA2-lib's utils/commons.h which now gets transitively included.

    That header also defines macros with other very common names, such as SPACE, HASH, MIN, rand_init, SWAP, etc. These all leak out of vcflib and end up defined in my user code.

    If I update vcflib in vg to the current version, I would have to go through he whole vg codebase and make sure that none of those tokens were used as the names of any functions, variables, or constants, in any scope or namespace. That seems impractical.

    Either vcflib should use a WFA2-lib that doesn't lay claim to so many token names (maybe by prefixing them all with the package name?), or vcflib should take steps to avoid leaking WAF2-lib macro definitions into code that depends only on vcflib. That could involve a bunch of #undef directives in vcflib, or it could involve not including any (or at least these) WFA2-lib headers in installed vcflib headers, and making the code that communicates with WFA2-lib include WFA2-lib headers only in its own compilation units, and not in the headers that define its interface.

    bug 
    opened by adamnovak 0
  • Installation does not work because packaged dependency headers are not installed

    Installation does not work because packaged dependency headers are not installed

    Describe the bug

    When I try to use make install to install vcflib, the WFA headers aren't installed.

    To Reproduce

    1. Clone vcflib and go into it.
    2. mkdir build && cd build
    3. cmake -DCMAKE_VERBOSE_MAKEFILE:BOOL=ON -DBUILD_ONLY_LIB=ON -DCMAKE_INSTALL_PREFIX=pwd/inst .. && make -j15 && make install
    4. Look in inst/include. In particular, inst/include/Variant.h will #include "bindings/cpp/WFAligner.hpp", but there is no inst/bindings/ directory installed.

    Expected behavior

    make install should install all necessary headers for compiling against vcflib. Ideally into directories by package name ("bindings" at the root of the include directory might already be occupied by some other installed library).

    Screenshots

    N/A

    Additional context

    Toggling BUILD_ONLY_LIB on and off doesn't seem to help, but having it on reduces the build time for testing.

    bug 
    opened by adamnovak 0
  • vcffilter killed in a pipe

    vcffilter killed in a pipe

    I tried to do vcfintersect in line with vcffilter in a unix pipe but it is always killed with no error message. This is how the command looks like:

    file=in.vcf.gz
    vcfintersect -b $BED $file | vcffilter -f "MQ > 29" - > ${file%.vcf.gz}_filtered.vcf.gz
    

    I tried to switch them but I got no luck also.

    If I do them separately, it works, but I don't want to make too many intermediate files if possible.

    vcfintersect -b $BED $file > ${file%.vcf.gz}_ints.vcf.gz
    vcffilter -f "MQ > 29" - > ${file%.vcf.gz}_mq30.vcf.gz
    

    I appreciate any help with this.

    EDIT: It is run in a cluster and the error message from the cluster is as follows:

    terminate called after throwing an instance of 'std::bad_alloc'
      what():  std::bad_alloc
    srun: error: i23r06c03s09: task 0: Aborted (core dumped)
    

    Does this mean the process are too memory hungry to be run in a pipe?

    opened by sagitaninta 0
  • wcFst produces neither output nor error message

    wcFst produces neither output nor error message

    Using the most recent conda installation

    wcFst -t 0,1 -b 2 -f sample.vcf -y PL > wcFst.txt

    INFO: there are 2 individuals in the target INFO: target ids: 0,1 INFO: there are 1 individuals in the background INFO: background ids: 2 INFO: file: sample.vcf INFO: setting genotype likelihood format to: PL

    Produces a blank wcFst.txt. No error messages or anything at all.

    bug 
    opened by PrestonMcDonald 0
Releases(v1.0.3)
  • v1.0.3(Jan 22, 2022)

    ChangeLog v1.0.3 (20220122)

    This is a maintenance release of vcflib.

    • Merge intervaltree changes (thanks @jnunn and @timmassingham)
    • Built with gcc-11
    • Fix issue #251 hapLrt: fix segfault when accessing genotype field. (thanks @mphschmitt)
    • Fix vcfflatten: fix segfault when no 'AF' field is present (#47, thanks @mphschmitt)
    • Fixes on vcfnulldotslashdot #310 (thanks @WinterFor)
    • Fix issue #301: Replace raw pointer usage with std::unique_ptr #306 (thanks @Glebanister)
    • Fix man page installation #321 (thanks @alexreg)
    • Use guix shell instead of guix environment for development
    • Regenerated online docs
    • README: add matrix badge (removed gitter badge)
    Source code(tar.gz)
    Source code(zip)
    vcflib-1.0.3-src.tar.gz(20.26 MB)
  • v1.0.2(Jan 4, 2021)

    ChangeLog v1.0.2 (20210104)

    This is a maintenance release of vcflib, mostly improving the build system, CI and generating markdown docs as well as man pages.

    • Removed tabixpp and htslib source dependencies, i.e., we are now using the distro provided libraries and include files through pkg-config. See also the README
    • Removed the tabixpp+htslib git submodules
    • Generalise and document the cmake build system
    • Added tests to the cmake build system and build instructions to README
    • Added support for ARM64 and PowerPC, see #292 (thanks @genisysram and @mr-c)
    • Added github actions for the issue tracker
    • Added githum CI
    • Updated header files in src with copyright/license info, see #16
    • Created markdown docs and man pages for all utilities. Created a script bin2md for markdown generation and use pandoc for the man page generation.
    Source code(tar.gz)
    Source code(zip)
    vcflib-1.0.2-src.tar.gz(19.91 MB)
  • v1.0.1(Sep 30, 2019)

  • v1.0.0(Sep 3, 2019)

    This synchronizes a number of changes that the vgteam has implemented on top of vcflib, including a new intervaltree implementation. Enjoy!

    Source code(tar.gz)
    Source code(zip)
  • v1.0.0-rc3(Aug 23, 2019)

    🐍 Some of the vcf* tools were actually Python 2.x scripts, which is EOL soon. These are now Python 3.x and use #!/usr/bin/env python3 hashbangs.

    ⚠️ Please use git clone --recursive when building this package because it uses sub-modules which are not included the source tarballs below.

    Source code(tar.gz)
    Source code(zip)
  • v1.0.0-rc1(Feb 8, 2016)

Structural variant detection and association testing

wham The wham suite consists of two programs, wham and whamg. wham, the original tool, is a very sensitive method with a high false discovery rate. Th

Zev Kronenberg 91 Sep 2, 2022
Chaste - Cancer Heart And Soft Tissue Environment - main public repository

Chaste - Cancer Heart And Soft Tissue Environment - main public repository

Chaste - Cancer Heart and Soft Tissue Environment 91 Sep 15, 2022
Public/backup repository of the GROMACS molecular simulation toolkit. Please do not mine the metadata blindly; we use https://gitlab.com/gromacs/gromacs for code review and issue tracking.

Welcome to the official version of GROMACS! If you are familiar with Unix, it should be fairly trivial to compile and install GROMACS. GROMACS uses o

Gromacs 457 Sep 22, 2022
C++ library and cmdline tools for parsing and manipulating VCF files

vcflib A C++ library for parsing and manipulating VCF files. overview The Variant Call Format (VCF) is a flat-file, tab-delimited textual format that

null 512 Sep 21, 2022
This is a collection of tools for creating and manipulating BitTorrent v2 torrent files

torrent tools This is a collection of tools for creating and manipulating BitTorrent v2 torrent files. torrent-new can create hybrid torrents, but the

Arvid Norberg 8 Jun 1, 2022
A combined suite of utilities for manipulating binary data files.

BinaryTools A combined suite of utilities for manipulating binary data files. It was developed for use on Windows but might compile on other systems.

David Walters 4 Jul 12, 2022
CfgManipulator is a powerful tool for manipulating configuration files.

CfgManipulator is a powerful tool for manipulating configuration files

Sanya 1 Feb 3, 2022
C library for encoding, decoding and manipulating JSON data

Jansson README Jansson is a C library for encoding, decoding and manipulating JSON data. Its main features and design principles are: Simple and intui

Petri Lehtinen 2.7k Sep 18, 2022
A C++ header-only library for creating, displaying, iterating and manipulating dates

The ASAP date/time library for beautiful C++ code ASAP is a small, header-only date-time library for C++11 and beyond. It is heavily inspired by my gr

Leonardo Guilherme de Freitas 55 Sep 5, 2022
C++ (with python bindings) library for easily reading/writing/manipulating common animation particle formats such as PDB, BGEO, PTC. See the discussion group @ http://groups.google.com/group/partio-discuss

Partio - A library for particle IO and manipulation This is the initial source code release of partio a tool we used for particle reading/writing. It

Walt Disney Animation Studios 402 Sep 13, 2022
Microsoft 2.5k Sep 22, 2022
Slow5tools is a toolkit for converting (FAST5 <-> SLOW5), compressing, viewing, indexing and manipulating data in SLOW5 format.

slow5tools Slow5tools is a simple toolkit for converting (FAST5 <-> SLOW5), compressing, viewing, indexing and manipulating data in SLOW5 format. Abou

Hasindu Gamaarachchi 51 Sep 9, 2022
Just a basic mini library for parsing simple files that only have variables written and with Lua extension.

C++ Parser Lua file config Just a basic mini library for parsing simple files that only have variables written and with Lua extension. Note: At the mo

Marcos Oliveira 3 Dec 26, 2021
High performance library for creating, modiyfing and parsing PDF files in C++

Welcome to PDF-Writer. A Fast and Free C++ Library for Creating, Parsing an Manipulating PDF Files and Streams. Documentation is available here. Proje

gal kahana 658 Sep 20, 2022
A lightweight C++14 parsing library for tmx map files created with the Tiled map editor

tmxlite Description A lightweight C++14 parsing library for tmx map files created with the Tiled map editor. Requires no external linking, all depende

Matt Styles 321 Sep 10, 2022
Hibizcus is a collection of tools - Font proofing and debugging tools

Hibizcus Font proofing and debugging tools. Written by: Muthu Nedumaran Hibizcus is a collection of tools written to proof and debug in-house develope

Muthu Nedumaran 20 Sep 17, 2022
Suckless-tools - My fork of suckless tools.

suckless-tools Here is my fork of suckless tools. I didn't include tabbed, i was using but not actively. I am using xfce4-terminal instead of st. Beca

null 2 Jan 7, 2022
The Vulkan Profiles Tools are a collection of tools delivered with the Vulkan SDK for Vulkan application developers to leverage Vulkan Profiles while developing a Vulkan application

Copyright © 2021-2022 LunarG, Inc. Vulkan Profiles Tools (BETA) The Vulkan Profiles Tools are a collection of tools delivered with the Vulkan SDK for

The Khronos Group 61 Sep 17, 2022
Helpful files and tools to get ffxiv running without the native launcher

ffxiv-on-mac About Installing What works What doesn't work Troubleshooting About ffxiv-on-mac is a set of files and scripts for running Final Fantasy

null 18 Apr 4, 2022
Simple command line tools to create/extract X4 .cat+.dat files

x4cat Simple command line tools to to create/extract X4 .cat+.dat files x4encat Usage: x4encat <archive name> Looks for a directory named <archive nam

Alexander Sago 1 Oct 31, 2021