packer-cn/vendor/github.com/ulikunitz/xz
Adrien Delorme 9f82b75e57 Use the hashicorp/go-getter to download files
* removed packer.Cache and references, since packer.Cache was never used except in the download step. The download step now uses the new func packer.CachePath(targetPath) for this; the behavior is the same.
* removed download code from packer that was reimplemented in the go-getter library: progress bar, HTTP download restart, checksumming from a file, skipping already-downloaded files, symlinking, and making a download cancellable by context.
* on Windows, if Packer is running without symlinking rights and we are getting a local file, the file will be copied instead to avoid errors.
* added unit tests for step_download that are now CI tested on Windows, macOS & Linux.
* files are now downloaded under the cache dir as `sha1(filename + "?checksum=" + checksum) + file_extension` (see the cache-path sketch after this entry)
* since the output dir is based on the source URL and the checksum, the file is auto-deleted when the checksum fails.
* a downloaded file is protected and locked by a file lock.
* updated docs
* updated go modules and vendors
2019-03-13 12:11:58 +01:00
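
A minimal sketch of the cache-path scheme described above; the cachePathFor helper, its arguments, and the values passed to it are illustrative only and are not Packer's actual API:

package main

import (
    "crypto/sha1"
    "encoding/hex"
    "fmt"
    "path/filepath"
)

// cachePathFor derives the cache file name from the source file name and its
// checksum: sha1(filename + "?checksum=" + checksum) + file extension.
// Illustrative helper only, not the real implementation.
func cachePathFor(cacheDir, filename, checksum string) string {
    sum := sha1.Sum([]byte(filename + "?checksum=" + checksum))
    return filepath.Join(cacheDir, hex.EncodeToString(sum[:])+filepath.Ext(filename))
}

func main() {
    fmt.Println(cachePathFor("packer_cache", "ubuntu-18.04.iso", "sha256:deadbeef"))
}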
Name           Last commit                                      Date
internal       Actualy add it                                   2018-07-25 02:36:18 +00:00
lzma           Use the hashicorp/go-getter to download files    2019-03-13 12:11:58 +01:00
.gitignore     Use the hashicorp/go-getter to download files    2019-03-13 12:11:58 +01:00
LICENSE        Actualy add it                                   2018-07-25 02:36:18 +00:00
README.md      Actualy add it                                   2018-07-25 02:36:18 +00:00
TODO.md        Use the hashicorp/go-getter to download files    2019-03-13 12:11:58 +01:00
bits.go        Actualy add it                                   2018-07-25 02:36:18 +00:00
crc.go         Actualy add it                                   2018-07-25 02:36:18 +00:00
example.go     Use the hashicorp/go-getter to download files    2019-03-13 12:11:58 +01:00
format.go      Actualy add it                                   2018-07-25 02:36:18 +00:00
fox.xz         Actualy add it                                   2018-07-25 02:36:18 +00:00
lzmafilter.go  Actualy add it                                   2018-07-25 02:36:18 +00:00
make-docs      Use the hashicorp/go-getter to download files    2019-03-13 12:11:58 +01:00
reader.go      Actualy add it                                   2018-07-25 02:36:18 +00:00
writer.go      Actualy add it                                   2018-07-25 02:36:18 +00:00

README.md

Package xz

This Go package supports reading and writing xz-compressed streams. It also includes a gxz command for compressing and decompressing data. The package is written entirely in Go and has no dependency on C code.

The package is currently under development. There might be bugs, and the APIs are not considered stable. At this time the package cannot compete with the xz tool in compression speed or ratio; the algorithms there have been developed over a long time and are highly optimized. However, a number of improvements are planned, and I'm very optimistic about parallel compression and decompression. Stay tuned!

Using the API

The following example program shows how to use the API.

package main

import (
    "bytes"
    "io"
    "log"
    "os"

    "github.com/ulikunitz/xz"
)

func main() {
    const text = "The quick brown fox jumps over the lazy dog.\n"
    var buf bytes.Buffer
    // compress text
    w, err := xz.NewWriter(&buf)
    if err != nil {
        log.Fatalf("xz.NewWriter error %s", err)
    }
    if _, err := io.WriteString(w, text); err != nil {
        log.Fatalf("WriteString error %s", err)
    }
    if err := w.Close(); err != nil {
        log.Fatalf("w.Close error %s", err)
    }
    // decompress buffer and write output to stdout
    r, err := xz.NewReader(&buf)
    if err != nil {
        log.Fatalf("NewReader error %s", err)
    }
    if _, err = io.Copy(os.Stdout, r); err != nil {
        log.Fatalf("io.Copy error %s", err)
    }
}
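
The same API also works with files. The following sketch compresses a file on disk, which is roughly what the gxz tool in the next section does; the compressFile helper and the file names are placeholders, not part of the package:

package main

import (
    "io"
    "log"
    "os"

    "github.com/ulikunitz/xz"
)

// compressFile writes an xz-compressed copy of src to dst.
// This helper is only a sketch, not part of the package API.
func compressFile(src, dst string) error {
    in, err := os.Open(src)
    if err != nil {
        return err
    }
    defer in.Close()
    out, err := os.Create(dst)
    if err != nil {
        return err
    }
    defer out.Close()
    w, err := xz.NewWriter(out)
    if err != nil {
        return err
    }
    if _, err := io.Copy(w, in); err != nil {
        return err
    }
    return w.Close()
}

func main() {
    if err := compressFile("bigfile", "bigfile.xz"); err != nil {
        log.Fatal(err)
    }
}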

Using the gxz compression tool

The package includes a gxz command line utility for compression and decompression.

Use the following command for installation:

$ go get github.com/ulikunitz/xz/cmd/gxz

To test it, call the following command.

$ gxz bigfile

After some time, a much smaller file bigfile.xz will replace bigfile. To decompress it, use the following command.

$ gxz -d bigfile.xz