Entwine - point cloud organization for massive datasets

Overview

Entwine is a data organization library for massive point clouds, designed to conquer datasets of hundreds of billions of points as well as desktop-scale point clouds. Entwine can index anything that is PDAL-readable, and can read/write to a variety of sources like S3 or Dropbox. Builds are completely lossless, so no points will be discarded even for terabyte-scale datasets.

Check out the client demos, showcasing Entwine output with Potree, Plas.io, and Cesium clients.

Usage

Getting started with Entwine is easy with Docker. First, we can index some public data:

mkdir ~/entwine
docker run -it -v ~/entwine:/entwine connormanning/entwine build \
    -i https://data.entwine.io/red-rocks.laz \
    -o /entwine/red-rocks

Now our output is at ~/entwine/red-rocks. We could also have passed a directory like -i ~/county-data/ to index multiple files. Next, we can statically serve ~/entwine with a simple HTTP server:

docker run -it -v ~/entwine:/var/www -p 8080:8080 connormanning/http-server

And view the data with Potree and Plasio.

To view the data in Cesium, see the EPT Tools project.

Going further

For detailed information about how to configure your builds, check out the configuration documentation. Here, you can find information about reprojecting your data, using configuration files and templates, enabling S3 capabilities, producing Cesium 3D Tiles output, and all sorts of other settings.
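
Configuration options may also be collected into a JSON file and passed with -c. As a hedged sketch only (the keys follow the patterns that appear in the issues further down this page; the output path is hypothetical), a config.json reprojecting the red-rocks sample might look like:

{
    "input": "https://data.entwine.io/red-rocks.laz",
    "output": "./red-rocks-3857",
    "reprojection": { "out": "EPSG:3857" }
}

entwine build -c config.json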

To learn about the Entwine Point Tile file format produced by Entwine, see the file format documentation.
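
As a quick orientation, a build produces a directory of static files. The listing below is a sketch based on the paths that appear elsewhere on this page (other entries are omitted):

ls ~/entwine/red-rocks
# ept.json              build metadata: schema, bounds, dimension statistics, ...
# ept-data/0-0-0-0.laz  the root data node (this file also appears in an issue below)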

Issues
  • running out of memory during dataset inference

    I have a bunch of 500MB to 1GB laz files in Google Cloud Storage, which I'm treating as HTTP sources. When I run Entwine on more than 3 or 4 of them at once, it just dies at the "Performing dataset inference" step; drastically increasing the memory allowed to Docker will cause it to work.

    So a couple related questions:

    • is there a way to pre run the inference stuff?
    • am I misreading the documentation, and should I just be running it on each file one at a time and merging them into a big pyramid? (see the sketch below)
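
    For reference, a hedged sketch of splitting the work into a separate scan plus subset builds merged afterward, using the scan, -s, and merge invocations that appear in later issues on this page (the input URL is hypothetical):

    entwine scan  -i https://example.com/tiles/ -o ./scan          # pre-compute dataset metadata only
    entwine build -i https://example.com/tiles/ -o ./out -s 1 4    # subset 1 of 4
    entwine build -i https://example.com/tiles/ -o ./out -s 2 4
    entwine build -i https://example.com/tiles/ -o ./out -s 3 4
    entwine build -i https://example.com/tiles/ -o ./out -s 4 4
    entwine merge ./out
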
    opened by calvinmetcalf 21
  • Reading .npy causes segfault

    It appears that while reading a .npy file an error is thrown. I am unsure whether this sits at the PDAL level or the Entwine level, but I have attached all relevant information. Demo .npy file

    I built a Dockerfile to properly install numpy until #114 is fixed, which is available here:

    FROM connormanning/entwine:latest
    RUN apt-get update && apt-get install -y \
    	python-numpy \
    	python-pip
    RUN pip install numpy
    ENTRYPOINT ["entwine"]
    

    Stacktrace:

    #0  0x00007fffdb951eba in PyErr_Occurred () from /usr/lib/x86_64-linux-gnu/libpython2.7.so.1.0
    #1  0x00007fffaccaf5a2 in ?? () from /usr/lib/python2.7/dist-packages/numpy/core/multiarray.x86_64-linux-gnu.so
    #2  0x00007fffacc8870e in ?? () from /usr/lib/python2.7/dist-packages/numpy/core/multiarray.x86_64-linux-gnu.so
    #3  0x00007ffff75568bf in pdal::Streamable::execute(pdal::StreamPointTable&) () from /usr/lib/libpdal_base.so.6
    #4  0x00007ffff7af3955 in entwine::Executor::run<entwine::PooledPointTable> (
        this=0x7ffff7dd3d40 <entwine::Executor::get()::e>, table=..., 
        path="/home/batman/one/xx/projects/map3d/scripts/tmp.qp560DsqXw/lanes/lanes10003.npy", reprojection=0x0, 
        transform=0x0, preserve=std::vector of length 0, capacity 0) at /var/entwine/entwine/util/executor.hpp:215
    #5  0x00007ffff7b2d2e5 in entwine::Executor::preview (this=0x7ffff7dd3d40 <entwine::Executor::get()::e>, 
        path="/home/batman/one/xx/projects/map3d/scripts/tmp.qp560DsqXw/lanes/lanes10003.npy", reprojection=0x0)
        at /var/entwine/entwine/util/executor.cpp:169
    #6  0x00007ffff7b12c99 in entwine::Scan::add (this=0x7fffffffdee0, f=..., 
        localPath="/home/batman/one/xx/projects/map3d/scripts/tmp.qp560DsqXw/lanes/lanes10003.npy")
        at /var/entwine/entwine/builder/scan.cpp:141
    #7  0x00007ffff7b12873 in entwine::Scan::<lambda()>::operator()(void) const (__closure=0x7fffe1731320)
        at /var/entwine/entwine/builder/scan.cpp:133
    #8  0x00007ffff7b15036 in std::_Function_handler<void(), entwine::Scan::add(entwine::FileInfo&)::<lambda()> >::_M_invoke(const std::_Any_data &) (__functor=...) at /usr/include/c++/7/bits/std_function.h:316
    #9  0x00007ffff7b35216 in std::function<void ()>::operator()() const (this=0x7fffe1731320)
        at /usr/include/c++/7/bits/std_function.h:706
    #10 0x00007ffff7b3413b in entwine::Pool::work (this=0x555555865b40) at /var/entwine/entwine/util/pool.cpp:107
    #11 0x00007ffff7b33b6d in entwine::Pool::<lambda()>::operator()(void) const (__closure=0x5555558216e8)
        at /var/entwine/entwine/util/pool.cpp:43
    #12 0x00007ffff7b34a27 in std::__invoke_impl<void, entwine::Pool::go()::<lambda()> >(std::__invoke_other, entwine::Pool::<lambda()> &&) (__f=...) at /usr/include/c++/7/bits/invoke.h:60
    #13 0x00007ffff7b34832 in std::__invoke<entwine::Pool::go()::<lambda()> >(entwine::Pool::<lambda()> &&) (
        __fn=...) at /usr/include/c++/7/bits/invoke.h:95
    #14 0x00007ffff7b34c94 in std::thread::_Invoker<std::tuple<entwine::Pool::go()::<lambda()> > >::_M_invoke<0>(std::_Index_tuple<0>) (this=0x5555558216e8) at /usr/include/c++/7/thread:234
    #15 0x00007ffff7b34c50 in std::thread::_Invoker<std::tuple<entwine::Pool::go()::<lambda()> > >::operator()(void) (this=0x5555558216e8) at /usr/include/c++/7/thread:243
    #16 0x00007ffff7b34c20 in std::thread::_State_impl<std::thread::_Invoker<std::tuple<entwine::Pool::go()::<lambda()> > > >::_M_run(void) (this=0x5555558216e0) at /usr/include/c++/7/thread:186
    #17 0x00007ffff6be5733 in ?? () from /usr/lib/x86_64-linux-gnu/libstdc++.so.6
    #18 0x00007ffff567e6db in start_thread (arg=0x7fffe1732700) at pthread_create.c:463
    #19 0x00007ffff62a188f in clone () at ../sysdeps/unix/sysv/linux/x86_64/clone.S:95
    
    
    opened by ProjectBarks 16
  • Support of additional point dimensions (e.g. classification) in output Cesium model.

    Hello again,

    Sorry if this is addressed somewhere, but it's not clear to me if the current version of Entwine supports the preservation of non-color, non-position point dimensions (e.g. classification) when converting to Cesium models.

    I assume this information would live in the batch table of the output Cesium 3D Tiles.

    It's hinted at in the discussion of the following issues & PRs, but a clear answer is not provided, nor could I find it in the documentation:

    • https://github.com/connormanning/entwine/issues/59
    • https://github.com/connormanning/entwine/pull/85
    • https://github.com/connormanning/entwine/issues/81

    Thanks for your help.

    opened by devdylan 14
  • no driver for s3

    looking at the docs, I should be able to run

    docker run -it  -v `pwd`:`pwd` -w `pwd` --rm connormanning/entwine build -i s3://iowa-lidar/iowa/ -o ./some/directory
    

    but doing so, I get the error Encountered an error: No driver for s3://iowa-lidar/iowa/*
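
    For context, the S3 driver generally needs credentials to be available inside the container. A hedged sketch only, reusing the environment variables shown in the Minio issue below (values are placeholders):

    docker run -it \
        -e AWS_ACCESS_KEY_ID=xxx -e AWS_SECRET_ACCESS_KEY=xxx \
        -v `pwd`:`pwd` -w `pwd` --rm connormanning/entwine build \
        -i s3://iowa-lidar/iowa/ -o ./some/directory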

    opened by calvinmetcalf 14
  • Minio S3 support

    How can I store Entwine output on Minio (which claims to be fully S3 compatible)?

    I'm trying to pass the arbiter credentials via environment variables like this (and as you can see, I needed to employ a little trick: since I can't set the complete AWS endpoint URL via the environment, I add the resulting URL as a Docker host alias pointing back to localhost) (credentials omitted):

    docker run --add-host="entwine.s3.amazonaws.com:127.0.0.1" -e "CURL_VERBOSE=1" -e "AWS_ACCESS_KEY_ID=xxx" -e "AWS_SECRET_ACCESS_KEY=xxx" --net=host --rm -it connormanning/entwine build -i https://entwine.io/sample-data/red-rocks.laz -o s3://entwine/red-rocks

    I confirmed that the credentials are correct by using mc and s3cmd to upload files to the correct bucket on my local Minio server. The problem is that I'm still getting 403 errors, so I'm wondering if there is a guide on how to properly configure Entwine to work with Minio.

    opened by greuff 11
  • Loss of detail in cesium output

    Hi

    I'm trying to convert a point cloud to cesium format and seem to be losing points in the output in some step.

    When I convert the input data using Entwine to a normal fileset and view it in Potree, the output looks like this:

    potree

    Command line and output:

    output-potree.log

    But when I make a cesium fileset the output is missing most of its points:

    cesium

    Command line and output:

    output-cesium.log

    cesium-intensity.json is cesium.json with coloring: "intensity". There is no change in the point count with the normal cesium.json

    I have tested the following things but nothing seems to fix this:

    • Changing tree depth settings in entwine
    • Reprojecting the input data with pdal to EPSG:4326 before giving it to entwine
    • Setting absolute: true

    The Entwine output seems to indicate that it has added all the points, but is it somehow guessing the type conversions wrong, so that the points are quantized to the same locations? The values for offset and bounds are different in the outputs.

    Lasinfo output from the input file:

    lasinfo.txt

    opened by kluthje 11
  • Creating big 3d-tiles

    Hi,

    Is it possible to continue creating a 3D Tiles index for Cesium if the previous command was interrupted (say, due to SSH)? If the source data is pretty big (hundreds of GB), generating tiles takes a long time, so the ability to continue a previous job would be very useful.

    thanks,

    Alex.

    opened by aleksandrmelnyk 10
  • Problem to create entwine index

    I'm trying to load 15 point cloud tiles (source file: http://dl.mapgears.com/mg-laz.tar) into an Entwine index. Entwine failed to load 3 of those files (278-5048_rgb.laz, 278-5047_rgb.laz, and 276-5049_rgb.laz).

    My script looks like this:

    docker run -it -v /opt/data/:/data connormanning/entwine entwine build -i /data/LAS/275-5047_rgb.laz -o /data/greyhound/RDP_RMI -b "[269000, 5034000, -100,308000, 5066000, 150]"
    docker run -it -v /opt/data/:/data connormanning/entwine entwine build -i /data/LAS/275-5048_rgb.laz -o /data/greyhound/RDP_RMI
    docker run -it -v /opt/data/:/data connormanning/entwine entwine build -i /data/LAS/275-5049_rgb.laz -o /data/greyhound/RDP_RMI
    docker run -it -v /opt/data/:/data connormanning/entwine entwine build -i /data/LAS/276-5047_rgb.laz -o /data/greyhound/RDP_RMI
    docker run -it -v /opt/data/:/data connormanning/entwine entwine build -i /data/LAS/276-5048_rgb.laz -o /data/greyhound/RDP_RMI
    docker run -it -v /opt/data/:/data connormanning/entwine entwine build -i /data/LAS/276-5049_rgb.laz -o /data/greyhound/RDP_RMI
    docker run -it -v /opt/data/:/data connormanning/entwine entwine build -i /data/LAS/277-5047_rgb.laz -o /data/greyhound/RDP_RMI
    docker run -it -v /opt/data/:/data connormanning/entwine entwine build -i /data/LAS/277-5048_rgb.laz -o /data/greyhound/RDP_RMI
    docker run -it -v /opt/data/:/data connormanning/entwine entwine build -i /data/LAS/277-5049_rgb.laz -o /data/greyhound/RDP_RMI
    docker run -it -v /opt/data/:/data connormanning/entwine entwine build -i /data/LAS/278-5047_rgb.laz -o /data/greyhound/RDP_RMI
    docker run -it -v /opt/data/:/data connormanning/entwine entwine build -i /data/LAS/278-5048_rgb.laz -o /data/greyhound/RDP_RMI
    docker run -it -v /opt/data/:/data connormanning/entwine entwine build -i /data/LAS/278-5049_rgb.laz -o /data/greyhound/RDP_RMI
    docker run -it -v /opt/data/:/data connormanning/entwine entwine build -i /data/LAS/279-5047_rgb.laz -o /data/greyhound/RDP_RMI
    docker run -it -v /opt/data/:/data connormanning/entwine entwine build -i /data/LAS/279-5048_rgb.laz -o /data/greyhound/RDP_RMI
    docker run -it -v /opt/data/:/data connormanning/entwine entwine build -i /data/LAS/279-5049_rgb.laz -o /data/greyhound/RDP_RMI
    

    On my third try I finally received a real error message about a bad memory allocation. I don't know why, but the first time I retried loading this file, Entwine crashed without any error message!

    Here's my log

    # docker run -it -v /opt/data/:/data connormanning/entwine entwine build -i /data/LAS/278-5048_rgb.laz -o /data/greyhound/RDP_RMI
    
    Continuing previous index...
    
    Input:
            Building from 13 source files
            Trust file headers? yes
            Work threads: 3
            Clip threads: 6
    Output:
            Output path: file:///data/greyhound/RDP_RMI/
            Temporary path: tmp/
            Compressed output? yes
    Tree structure:
            Null depth: 6
            Base depth: 10
            Cold depth: lossless
            Mapped depth: 13
            Sparse depth: 13
            Chunk size: 262144 points
            Dynamic chunks? yes
            Prefix IDs? no
            Build type: hybrid
            Point count hint: 13740813 points
    Geometry:
            Conforming bounds: [(269000.00000, 5034000.00000, -100.00000), (308000.00000, 5066000.00000, 150.00000)]
            Cubic bounds: [(268990.00000, 5030490.00000, -19485.00000), (308010.00000, 5069510.00000, 19535.00000)]
            Reprojection: (none)
            Storing dimensions: [X, Y, Z, Intensity, ReturnNumber, NumberOfReturns, ScanDirectionFlag, EdgeOfFlightLine, Classification, ScanAngleRank, UserData, PointSourceId, GpsTime, Red, Green, Blue, Origin]
    
    Adding 12 - /data/LAS/278-5048_rgb.laz
     A: 1048576 C: 1 H: 38
            Pushes complete - joining...
    Unknown error during /data/LAS/278-5048_rgb.laz
    terminate called after throwing an instance of 'std::bad_alloc'
      what():  std::bad_alloc
    Got error 11
    entwine[0x41b00b]
    /lib/x86_64-linux-gnu/libc.so.6(+0x352f0)[0x7f6c650ec2f0]
    /lib/x86_64-linux-gnu/libc.so.6(abort+0x2d6)[0x7f6c650ee036]
    /usr/lib/x86_64-linux-gnu/libstdc++.so.6(_ZN9__gnu_cxx27__verbose_terminate_handlerEv+0x16d)[0x7f6c65a0006d]
    /usr/lib/x86_64-linux-gnu/libstdc++.so.6(+0x5eee6)[0x7f6c659fdee6]
    /usr/lib/x86_64-linux-gnu/libstdc++.so.6(+0x5ef31)[0x7f6c659fdf31]
    /usr/lib/libentwine.so(+0x59beb)[0x7f6c66812beb]
    

    Here's my memory log just before failure:

    [email protected]:/opt/data/MNT# free -m
                 total       used       free     shared    buffers     cached
    Mem:         16491      16343        147          0          0         14
    -/+ buffers/cache:      16328        162
    Swap:         4095       4004         91
    

    If I create a new Entwine index with only this point cloud file, it works well. I hope you will be able to reproduce this problem.

    opened by smercier 10
  • Unable to process pointcloud "No type found for undefined dimension"

    I am attempting to re-index point clouds that have previously been indexed with (Docker-based) Entwine 1.x. I am using the same source LAS files for the indexing; however, with the newer version of Entwine, each LAS file seems to generate the following:

    (readers.las Error) Global encoding WKT flag not set for point format 6 - 10. Exception in pool task: No type found for undefined dimension. SRS could not be determined

    The scan ends stating that the SRS could not be determined, with very bare output. If I attempt a build using the scan output, then in addition to the warnings for each LAS file, I get an error at the end stating that no points were found.

    I have run the exact same LAS file through 1.3; it still complains about point formats but completes:

    (readers.las Error) Invalid SRS specification. GeoTiff not allowed with point formats 6 - 10. Writing details to /tmp/out.entwine-inference...

    I am invoking with: docker run -it -v `pwd`:/tmp connormanning/entwine:1.3 infer -i /tmp/Pontshill_2_000015.las -o /tmp/out

    and docker run -it -v `pwd`:/tmp connormanning/entwine:latest scan -i /tmp/Pontshill_2_000015.las -o /tmp/out2

    I have tried forcing an SRS via a config file:

    {
        "reprojection": { "in": "EPSG:27700", "out": "EPSG:27700", "hammer": true },
        "input": "/tmp/Pontshill_2_000015.las",
        "output": "/tmp/scanout"
    }

    which gives the No points found! error:

    docker run -it -v `pwd`:/tmp connormanning/entwine:latest scan -c /tmp/scan.config

    Scanning:
            Input: /tmp/Pontshill_2_000015.las
            Threads: 8
            Reprojection: EPSG:27700 (OVERRIDING file headers) -> EPSG:27700
            Trust file headers? yes

    1 / 1: /tmp/Pontshill_2_000015.las
    (readers.las Error) Global encoding WKT flag not set for point format 6 - 10.
    (readers.las Error) Global encoding WKT flag not set for point format 6 - 10.
    Exception in pool task: No type found for undefined dimension.
    Encountered an error: No points found!
    Exiting.

    Is there something I am missing?

    opened by Hisol 9
  • entwine 1.1.0 build fails - jsoncpp issues

    Entwine 1.1.0 is failing to build on:

    • centos 7
    • gcc 4.8.5 (and 6.2.1)

    with dependencies:

    • PDAL (master) in /usr/local, built from source
    • jsoncpp 1.8.0 in /usr/local, built from source

    First error:

    /local/build/entwine/entwine/third/arbiter/arbiter.cpp:2609:47: error: invalid conversion from ‘const void*’ to ‘void*’ [-fpermissive]
             if (BIO* bio = BIO_new_mem_buf(s.data(), -1))
    

    ...and then, using -fpermissive:

    [ 98%] Linking CXX executable entwine
    /usr/bin/ld: warning: libjsoncpp.so.11, needed by /usr/lib/gcc/x86_64-redhat-linux/4.8.5/../../../libpdal_base.so, may conflict with libjsoncpp.so.0
    /usr/bin/ld: CMakeFiles/kernel.dir/build.cpp.o: undefined reference to symbol '_ZN4Json5ValueC1Em'
    /usr/local/lib64/libjsoncpp.so.11: error adding symbols: DSO missing from command line
    collect2: error: ld returned 1 exit status
    make[2]: *** [kernel/entwine] Error 1
    make[1]: *** [kernel/CMakeFiles/kernel.dir/all] Error 2
    make[1]: *** Waiting for unfinished jobs....
    

    Not sure what this means. I can't remove the CentOS default jsoncpp; I figured placing a newer version in /usr/local would be the fix, and PDAL is linked against the newer jsoncpp just fine.

    Advice appreciated!

    opened by adamsteer 9
  • Merge command line

    Hi

    I'm trying the Entwine merge command line on my Linux server, but I get an error. Is there an option I should add to the build commands to make it succeed?

    # docker run -it -v /opt/data/greyhound/:/data connormanning/entwine entwine build -r EPSG:2950 EPSG:3857 -s 1 4 -i /data/270_5035.las -o /data/270
    # docker run -it -v /opt/data/greyhound/:/data connormanning/entwine entwine build -r EPSG:2950 EPSG:3857 -s 2 4 -i /data/270_5036.las -o /data/270
    # docker run -it -v /opt/data/greyhound/:/data connormanning/entwine entwine build -r EPSG:2950 EPSG:3857 -s 3 4 -i /data/270_5037.las -o /data/270
    # docker run -it -v /opt/data/greyhound/:/data connormanning/entwine entwine build -r EPSG:2950 EPSG:3857 -s 4 4 -i /data/270_5038.las -o /data/270
    # docker run -it -v /opt/data/greyhound/:/data connormanning/entwine entwine merge /data/270
    Waking up base
    Merging /data/270...
        1 / 4 done.
        2 / 4Waking up base
     merging...Encountered an error: Invalid manifest paths
    Exiting.
    

    Thank you

    opened by smercier 9
  • use las 1.4

    Use LAS 1.4 instead of LAS 1.2. This might have side effects with third-party software used to directly read 1.2 laz tiles, but PDAL's readers.ept should manage it.

    opened by gui2dev 1
  • using deprecated flags throws error rather than warning

    I ran into this trying to port some code that happened to use --noTrustHeaders up to Entwine 2.2.0. MWE:

    #conda create --name entwine-220-env
    #conda activate entwine-220-env
    #conda install -y -c conda-forge entwine=2.2.0
    
    $ entwine info --noTrustHeaders
    Encountered an error: Invalid argument: --noTrustHeaders
    Exiting.
    

    It looks like there is some kind of deprecation warning. I would expect using those flags to print the message and switch to deep behind the scenes rather than throw an error. Is this the correct behavior?

    Also, is there a place where all the deprecated flags are listed? I did not see anything in the release notes; I ended up downloading a couple of different versions and running entwine scan --help to figure out when options got dropped.

    opened by Crghilardi 0
  • OriginId dimension not added to the data

    Hello,

    When I build data from LAZ files, I do not have the OriginId dimension added (or I do not find it).

    Reading the documentation, I understand that this dimension should be added by default (we can only disable it).

    For example: (entwine 2.2.0)

    entwine build -i https://data.entwine.io/red-rocks.laz -o red-rocks
    

    No "OriginId" in red-rocks/ept.json schema neither than in the LAZ files:

    pdal info red-rocks/ept-data/0-0-0-0.laz -p 0,0,0
    
    {
      "file_size": 83192,
      "filename": "red-rocks/ept-data/0-0-0-0.laz",
      "now": "2022-02-17T14:49:28+0100",
      "pdal_version": "2.3.0 (git-version: bc9604)",
      "points":
      {
        "point":
        {
          "Blue": 150,
          "Classification": 0,
          "EdgeOfFlightLine": 0,
          "GpsTime": 0,
          "Green": 170,
          "Intensity": 0,
          "NumberOfReturns": 1,
          "PointId": 0,
          "PointSourceId": 0,
          "Red": 179,
          "ReturnNumber": 1,
          "ScanAngleRank": 0,
          "ScanDirectionFlag": 0,
          "UserData": 0,
          "X": 482311.78,
          "Y": 4391069,
          "Z": 1948.34
        }
      },
      "reader": "readers.las"
    }
    

    Should I explicitly ask to have this information or is that a bug?

    opened by VSasyan 0
  • Fix building against recent (main branch) of PDAL.

    LasHeader was refactored in PDAL so that:

    scaleX(), scaleY(), scaleZ(), offsetX(), offsetY(), offsetZ()

    are now

    scale.x, scale.y, scale.z, offset.x, offset.y, offset.z

    as of PDAL/[email protected]

    opened by klassenjs 2
  • how subset works?

    I found this in the docs:

    Entwine builds may be split into multiple subset tasks, and then be merged later with the merge command. Subset builds must contain exactly the same configuration aside from this subset field.

    Subsets are specified with a 1-based id for the task ID and an of key for the total number of tasks. The total number of tasks must be a power of 4.

    However, I don't understand whether subsetting needs the full input datasets and produces only a subset of the output dataset, or whether, by correctly specifying the BBOX of all datasets, I can do subsetting without having all the input files.
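
    For what it's worth, reading the quoted documentation literally, every subset task would use the same configuration, including the full input list, and differ only in the subset field. A hedged sketch of one of four task configurations (paths borrowed from another issue on this page):

    {
        "input": "/data/LAS/",
        "output": "/data/out",
        "subset": { "id": 1, "of": 4 }
    }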

    opened by julienlau 0
Releases
  • 2.2.0 (Aug 4, 2021)

    This release contains some improvements to the overall workflow for generating EPT data, several new cloud storage drivers and enhancements to the existing ones, and some bug fixes.

    • entwine scan has been replaced with entwine info, which has the same general purpose but an improved format which is easier to work with
    • entwine build should be a bit more efficient in both memory and time usage
    • Detailed dimension statistics are aggregated and written into the EPT metadata
    • entwine convert has been removed in favor of ept-tools, which can convert EPT to 3D Tiles on the fly as a server or in a serverless (e.g. AWS Lambda) environment
    • Cloud storage driver enhancements, additions (for example, an Azure blob storage driver by @gui2dev), and bug fixes
  • 2.1.0 (Jul 23, 2019)

    Entwine 2.1 brings a variety of performance and quality-of-life improvements.

    • Indexing speed should be noticeably faster in most cases, with lower memory usage
    • Output file sizing is more consistent even across widely-varied source data
    • Default output chunk size is now a bit smaller for snappier visualization
    • New data encoding compression: Zstandard
    • Various bug fixes and improvements by users and contributors (thanks!)
    • Fewer dependencies - external dependency for JSON removed
    • Better logging information
  • 2.0.0 (Dec 19, 2018)

  • 1.3.0 (Jul 3, 2018)

    This will be the final release of the black-box Entwine output, maintained for Greyhound-required workflows. See #98.

    • Robustness fixes for reading
    • Update to PDAL's generic compression API
    • Cesium batch table support
  • 1.2.0 (Dec 4, 2017)

    • Add initial append-attributes implementation
    • Various performance enhancements, particularly for terrestrial data
    • Cloud-scaling improvements for large datasets
    • Option to preserve pre-reprojection values in the output
    • Update to PDAL 1.6
  • 1.1.0 (May 16, 2017)

    • Replace laz-perf storage with las-zip storage by default, which can significantly decrease output size
    • Add fallback to non-streaming PDAL API when a PDAL reader does not support streaming
    • Make merging faster and more resilient to spurious errors (for example S3 GET error) for large builds
    • Various minor fixes and documentation improvements
  • 1.0.0 (Mar 3, 2017)

    Initial release of Entwine - a data organization library for massive point clouds. Built on PDAL, Entwine supports a variety of file formats. Built for parallelization, Entwine is designed to conquer datasets of hundreds of billions of points as well as desktop-scale point clouds with lossless output. Simple query access via HTTP is provided by Greyhound, as seen with Plasio and Potree. Static tileset output for Cesium 3D Tiles is also supported.
