Full-body tracking in VR using AprilTag markers.

This is my second attempt at creating a full-body tracking system using fiducial markers. It should let people get full-body tracking for free, using only a phone and some cardboard. It is possible to get pretty good tracking with trackers as small as 10 cm and a PS3 Eye camera at 640x480 resolution. Increasing the marker size or using a higher-resolution, faster phone camera improves tracking further.


To use it, you will have to make three trackers - one for each leg and one for the hips. Using only leg trackers will not work in VRChat!

This version uses the much more accurate AprilTag system and includes many improvements that make the system easier to use, such as a GUI and a more straightforward calibration.

If you have any issues or encounter any bugs, feel free to open an issue on GitHub or message me on Discord: https://discord.gg/g2ctkXB4bb

The program can be downloaded from the releases tab.


Beat Saber demo: https://youtu.be/Akps-dH0EeA

Short setup video:

I am not great at making actual tutorials, but I did record a short video of me setting everything up. It's not a replacement for the tutorial below, but it may help you understand some of the steps better.



We now have a Discord server!


Connecting a camera

The first step is connecting a camera feed to your computer. This step is probably the most involved, since you will have to find out what works best for you. Each method has its pros and cons, so try them out and see what works best. If you know of another option, feel free to use that!

This tutorial only outlines the ways to connect a camera, with their pros and cons. If you run into problems with any of them, refer to their official tutorials. Get the camera working before continuing with the tutorial.

Using a USB webcam:


Pros:

  • Simple to set up
  • High-quality cameras will offer good performance (1080p 60fps)

Cons:

  • Most cameras have too low a resolution and too much motion blur to use effectively

If you have a USB camera, you should try that first. If tracking is too bad, you can always switch to a phone later. A PS3 Eye camera will work, but just barely, due to its low resolution.

To ensure the camera is working as well as it can, refer to the Start/Stop camera part.


Connect the camera to your PC and you are done! If you use a PS3 Eye camera, also install the PS3 Eye universal driver, not the CL Eye one!

Using IP-Webcam wireless:


Pros:

  • Fairly simple to set up
  • Plenty of video options

Cons:

  • Requires a good wifi connection
  • Your PC and phone must be connected to the same network
  • Only for Android phones

If your PC and Android phone are connected to the same router and you have a strong wifi connection on your phone, this is the option you should use.


Download the IP Webcam app from the Play Store and start it. Under Video preferences -> Video resolution, select the resolution you wish to use. Try to use a 4:3 aspect ratio with a resolution of around 800x600. Then go back and tap Start server. Try to connect to your phone through your browser: tap the help icon if you don't know how.

To ensure the camera is working as well as it can, refer to the Start/Stop camera part.

Using IP-Webcam wired:


Pros:

  • Plenty of video options

Cons:

  • A little harder to set up
  • May not work on all phones and computers

If you don't have a good wifi connection but have a reasonably recent Android phone, you should try this option. It may not work on every device, however.


For this we will use the phone's network-over-USB feature. This is normally used to share the phone's mobile data or wifi with a computer, but if we disable wifi and mobile data on the phone, we can use it as a direct connection between the phone and the PC. First disable wifi and mobile data. Then connect your phone to your PC with a USB cable and enable the internet-over-USB option on your phone. You can now follow the same instructions as for the wireless setup!

NOTE: Make sure that your networks are disabled or this won't work!

Using DroidCam OBS:


Pros:

  • Should work on any device, including iPhones
  • Wired or wireless

Cons:

  • Fewer video options than IP-Webcam
  • Higher latency

If the previous options don't work for you or you have an iPhone, this is the option to choose. It should work on any device, either wireless or wired.


First, follow the DroidCam OBS official tutorial to get the phone-OBS connection. Use the 720p video resolution to ensure there is no watermark. Then, follow the OBS VirtualCam plugin tutorial to stream to a virtual camera. The phone will now act as a regular webcam.

The trackers

How the trackers work


This is a single AprilTag marker. Each marker has a white square in the middle that is used for detection and a unique pattern of black and white for identification. This means that a single marker must always be completely visible and completely flat in order to be detected.

A tracker is composed of multiple markers facing different directions, which ensures that at least one marker is visible when it rotates. They must all be fixed together, and none of the markers in a tracker should move or bend separately. Tracker 0 is composed of marker id 0 and any number of extra markers with ids 1-44, tracker 1 is marker id 45 and any number of markers 46-89, and so on.
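
The id-to-tracker mapping described above can be sketched as a small helper. This is an illustration only, not code from ATT; the v0.7.1 release notes mention the markers-per-tracker count is configurable in config.yaml, and 45 is the default assumed here:

```python
MARKERS_PER_TRACKER = 45  # ATT default; configurable in config.yaml per the v0.7.1 notes

def tracker_of_marker(marker_id: int) -> int:
    """Return the 0-based tracker index that a marker id belongs to."""
    return marker_id // MARKERS_PER_TRACKER

# Tracker 0 owns ids 0-44, tracker 1 owns 45-89, tracker 2 owns 90-134, ...
assert tracker_of_marker(0) == 0
assert tracker_of_marker(46) == 1
assert tracker_of_marker(91) == 2
```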

The simplest version of three trackers is the following: tracker 0 made of markers 0 and 1, tracker 1 of markers 45 and 46, and tracker 2 of markers 90 and 91. To prevent bending, the markers are glued to cardboard, and each pair is glued together at a 90° angle. To make them yourself, print the Apriltag.pdf file and refer to the photos below to cut them out and glue them properly.


Cut along the red lines, bend along the blue lines. Print the image without the lines.


How the trackers should look from the front


How the trackers should look from the top


Close-up of what's behind the leg trackers. Notice the small piece of cardboard for support against bending and the hooks for rubber bands.


Tracker 0 will be used on the hips and, since the rubber bands will stretch it more, should have some additional supports. I used some wire, but you can just use more cardboard instead.

Add some way of fixing the trackers to your body. I use hooks, through which I can then fix the trackers using rubber bands.

NOTE: Make sure the pattern on the trackers is clearly visible and is not obstructed! Ensure the markers bend as little as possible!

Installing the SteamVR driver

Since version 0.4, we have an installer! Simply run install_driver.exe from driver_files and it will do these steps for you!

Inside the driver_files folder, there is an apriltagtrackers folder. Copy this folder to "Steam/steamapps/common/SteamVR/drivers". Now open "Steam/config/steamvr.vrsettings" and, under "steamvr", add the field "activateMultipleDrivers" : true,. The section will then look something like:

"steamvr" : {
      "activateMultipleDrivers" : true,
      "installID" : "17046041383408253036",
      "lastVersionNotice" : "1.15.10",
      "lastVersionNoticeDate" : "1605567489",
      "showAdvancedSettings" : true

This ensures that every time SteamVR launches, it will attempt to connect to the ApriltagTrackers program through our driver.

Running Apriltag Trackers

You can now run Start_ApriltagTrackers.bat! If you can't find it, make sure you downloaded the correct files from the releases tab. The first time you launch it, you may see a black console window for a few seconds. Below is a quick guide to what the buttons and parameters do.

Camera tab

This is the main tab of the program that you will use most of the time.

Start/Stop camera

Starts the camera, or stops it if it's already running. This button will open the camera that you have entered in the params tab. To make sure it's working correctly, enable the Show preview checkbox.

If the camera fails to start, make sure that your camera is connected and running, that you have entered the correct id/address, and that you have saved the parameters after changing them. If you are using a USB webcam or OBS, try a different id - it will be in the range 0-10. You may also have to set the correct camera width/height in the parameters.

USB webcam specific parameters

We must ensure the camera is opened correctly. The preview window is not scaled - if you use a 1080p camera, the window will cover the whole screen. If it's too small or has black bars around it, we have to fix that. First, enter the correct width and height parameters.

If that doesn't work, or the window still has black bars despite being larger, look at the camera API parameter. Changing it to DirectShow, i.e. 700, should fix the resolution problem. The camera may then run at a low fps - in that case, drop the resolution down to 720p.

If the camera does not work no matter the parameters, you can open it in OBS and stream it through the VirtualCam plugin.

This is not enough to ensure good tracking, however - we still have to remove as much motion blur from the camera as we can, or tracking will fail when we move. To do this, we have to lower the exposure of the camera. First, if the camera supports 60 fps, set that here. The PS Eye supports 70.

Then there are a couple of ways to set exposure, depending on your camera. First, through your camera's own software - some cameras, like Logitech ones, ship software where you can set exposure and gain. Second, if you use the DirectShow API, you can check Open camera settings, save, and open the camera - a settings dialog should show up. Third, you can enable the bottom three options and write the values there.

For the actual settings, we first have to disable autoexposure. In the settings window this is a checkbox; in the params tab it is usually 0. The exposure value is usually a power-of-two exponent of seconds. We want at most 8 ms, which is -7, but ideally 4 ms, or -8. For gain, just set it high enough that the image looks normally bright.
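
Since the exposure value is a power-of-two exponent of seconds, you can sanity-check a value like this (a quick illustration, not part of the program):

```python
def exposure_ms(value: int) -> float:
    """Convert a power-of-two exposure exponent (seconds) to milliseconds."""
    return (2.0 ** value) * 1000.0

# -7 -> 2^-7 s = 7.8 ms of blur per frame; -8 -> 3.9 ms
assert round(exposure_ms(-7), 1) == 7.8
assert round(exposure_ms(-8), 1) == 3.9
```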

Once that is set, there is one more parameter to look at - you can check the Rotate camera parameters to rotate the camera sideways. This will give you more vertical space, meaning you can be closer to the camera.

IP Webcam specific parameters

For IP Webcam, you can set the resolution in the app - try to use a 4:3 aspect ratio; around 800x600 works best. Use portrait mode if you use all three trackers, and landscape mode otherwise.

In the downloaded folder, under /utilities, there is a set_exposure.bat script. Edit it, entering your camera's IP address and the desired exposure - test a few values to find the one with the best brightness, but it should be under 10 ms (10,000,000 ns).

Running this script will set the exposure of your phone's camera, if your phone supports it.
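
Under the hood, the script just issues an HTTP request to the phone. A minimal Python equivalent might look like the sketch below - note that the /settings/exposure_ns endpoint and the port are assumptions about IP Webcam's HTTP API and may differ between app versions:

```python
import urllib.request

def exposure_url(ip: str, exposure_ns: int, port: int = 8080) -> str:
    """Build the IP Webcam settings URL (endpoint name assumed, see note above)."""
    return f"http://{ip}:{port}/settings/exposure_ns?set={exposure_ns}"

def set_exposure(ip: str, exposure_ns: int) -> None:
    """Fire the request at the phone; requires the IP Webcam server to be running."""
    urllib.request.urlopen(exposure_url(ip, exposure_ns))

# 5 ms = 5,000,000 ns (the address here is a made-up example)
assert exposure_url("192.168.1.101", 5_000_000) == \
    "http://192.168.1.101:8080/settings/exposure_ns?set=5000000"
```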

Calibrate camera

This will start the camera calibration sequence. Print the charuco_board.jpg from the print files and place it on a flat surface. Turn off the camera preview before starting. This only has to be done the first time you use the program and whenever you change the camera you are using.

Make sure autofocus is disabled on your camera!

A window will open with the camera feed. Every few seconds, the camera will take a picture. Move the camera around slowly, taking pictures of the charuco pattern from as many different angles as possible. Move the camera up close to the pattern: you want it to take up as much of the image as possible for the first 10 images!

Since v0.4, the calibration will also give feedback on how good the current calibration is. The corners will stay on the image as dots, with the color representing their reprojection error: yellow means bad calibration, purple means good calibration. Make sure most of the dots are purple before proceeding to the next step!

Another indicator of how well calibrated the camera is is the grid: it should be spread nicely over the whole image and shake as little as possible. Once that is done, press OK to finish calibration.

Sometimes, if the picture is too blurry or the lighting is bad, the pattern won't be detected. Make sure you have uniform lighting.

Alternatively, you can display the board on your screen and calibrate with that, but the calibration may be worse or not work at all. You can also switch to the old chessboard calibration in the params, in case you can't print the new board but have a chessboard around.


Using a wooden chessboard for calibration. While the board is different from the charuco board you will use, the process is the same.

Calibrate trackers

This will start tracker calibration. The camera should be running and calibrated. Before starting, put on your trackers the same way you will use them in game. You only have to do this step on first launch and whenever you change the trackers.

Capture the trackers with the camera by moving each tracker closer than 30 cm to the camera. To add a marker to the tracker, film it while another, already added marker is visible. A green marker means it is already added, yellow means it is being added, and red means it cannot be added because no already added marker is in view. Purple means the tracker is too far from the camera. Repeat this process until all markers on the trackers are green, then click OK.

If some of the markers only have a thin blue outline, they are detected but do not belong to any of the used trackers. You have probably set too low a number of trackers in the parameters.

If markers are not detected at all, make sure your camera is not mirrored.


Example of tracker calibration. Rotate the trackers around a bit. The axes should follow smoothly no matter how many of the tracker's markers are detected.

Connect to SteamVR

Make sure SteamVR is running. When you press this button, the program will connect. If the connection succeeds, you will see the trackers and a base station in the status window, next to the HMD and controllers.

If you quit SteamVR but not the program, you have to press this button again after restarting SteamVR. Press yes when it asks to restart the connection.


Place your camera somewhere around hip height. Since v0.3, the direction of the camera does not matter.

If the trackers do not show up in the status window, but pressing Connect to SteamVR throws no error, you have not added the activateMultipleDrivers option to the config - try that again, being careful where you put it and that the comma is in the correct place.


Before starting, disable SteamVR Home or the camera will not show up. This button should be pressed from the virtual desktop in the SteamVR dashboard. It will start the program and open the detection window.

In the window, you can see the following: detected markers will have a green outline with their id written in blue. Around the markers there will be blue circles: these represent the area that is searched during detection. If not all trackers are found, the entire vertical area is searched every second; you will see blue squares instead of circles when that happens.

Note that while the trackers are rendered above everything else, so they can always be seen, the camera is rendered behind: this means it will not be visible if it is behind the dashboard or below the playspace square on the floor.

Playspace calibration

At first, the trackers will be in the floor. You need to calibrate the playspace in order to use them in games. Do this by checking the Calibration mode checkbox.

A camera should now appear in SteamVR, and six fields should appear in the program. Use the first three fields to align the SteamVR camera's position with your camera's position IRL. You can adjust the values using your thumbstick. To get the position accurate, it's best to place one of your controllers next to the camera and use it as a reference.

Once the position is aligned, use the next three values to align rotation. When the camera is coarsely aligned, step in front of it so the trackers can be seen. You can now make some finer adjustments to the camera rotation, until the SteamVR pucks align nicely with your trackers IRL - each puck should lie flat on one side of its tracker. You can, again, help yourself with a controller.

When you are done, uncheck Calibration mode to save!


Params tab

Below are short descriptions of the parameters you can set.

When changing parameters, make sure that you press Save or they will not take effect! While some changes apply immediately after saving, others require restarting tracking (press the Start button on the camera tab to stop, then press it again to start; you do not need to recalibrate the playspace after doing this).

Ip or ID of camera:

If you have a webcam or OBS, this value will be a number, usually 0 or 1, or higher if you have more cameras connected. The best way to figure out the correct index is to try them: type in 0, press Save, go back to the Camera tab, enable the preview and press Start/Stop camera. If the correct camera shows up, great, you can go to the next step! If not, repeat the process with 1, then 2, and so on until you find it.

If you use IP Webcam, enter your IP address - the same one you used in your browser, but ending with /video (for example http://192.168.1.101:8080/video, with your own numbers).

Number of trackers:

The number of trackers you wish to use. For full body, you have to use 3. You cannot use full-body tracking in VRChat with just 2!

Size of markers in cm:

Measure the size of your printed markers in cm and enter the value here. Measure the white square.


Rotate camera clockwise/counterclockwise:

This will rotate the camera view 90° in the chosen direction. It lets you stand closer to the camera, which is useful if you don't have much space or you have a low-resolution camera (640x480). If you use a PS3 Eye, you should use this. You can also check both options to rotate the camera 180°.

Number of values for smoothing:

The algorithm uses sliding-window mean smoothing. This is the number of previous position values used for the window. It removes tracking outliers, but introduces some delay. 5 seems to be the best balance between delay and performance; in most cases, the value can be dropped to 3 to reduce latency.
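
The sliding-window mean can be sketched as follows (illustrative Python only; the actual implementation is C++ inside ATT, and the names here are made up):

```python
from collections import deque

class SlidingMean:
    """Average the last n positions to suppress outliers, at the cost of some delay."""

    def __init__(self, n: int = 5):
        self.window = deque(maxlen=n)  # old values fall out automatically

    def update(self, position: float) -> float:
        self.window.append(position)
        return sum(self.window) / len(self.window)

f = SlidingMean(n=3)
assert f.update(1.0) == 1.0
assert f.update(2.0) == 1.5
assert f.update(3.0) == 2.0   # mean of 1, 2, 3
assert f.update(4.0) == 3.0   # window slides: mean of 2, 3, 4
```

A real tracker would apply this per axis (x, y, z), but the principle is the same.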

Additional smoothing:

While the sliding mean does some smoothing, it is usually not enough to eliminate shaking. Additional smoothing is done using a leaky integrator, with the formula: current_position = previous_position * value + tracked_position * (1 - value).

This means the parameter is between 0 and 1: 0 means using only the tracking data without smoothing, and 1 means using only the previous data. Decreasing this parameter increases responsiveness, but also increases shaking. Experiment with different values to find the sweet spot.
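
In code, the leaky integrator formula above is a one-liner (a Python sketch of the formula, not the program's actual implementation):

```python
def leaky_integrator(previous: float, tracked: float, value: float) -> float:
    """current = previous * value + tracked * (1 - value); value in [0, 1]."""
    return previous * value + tracked * (1.0 - value)

# value = 0: raw tracking data passes through unchanged
assert leaky_integrator(0.0, 10.0, 0.0) == 10.0
# value = 1: the output never moves (only previous data is used)
assert leaky_integrator(5.0, 10.0, 1.0) == 5.0
# value = 0.5: move halfway toward the new measurement each step
assert leaky_integrator(0.0, 10.0, 0.5) == 5.0
```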

Quad decimate:

This is the quality setting. The value can be 1, 1.5, 2, 3 or 4. The higher this value, the faster the tracking, but the shorter the range. It depends on the camera you use: in general, you will probably have to use 1 at 480p, 2 at 720p and 3 at 1080p. You can fine-tune this parameter later based on the performance you are getting (if you get high FPS, you can decrease it; if your trackers aren't detected well, increase it).

Search window:

To increase performance, the algorithm only searches for trackers in a window around the position where they were last seen. This parameter sets the size of that window. Lowering the value makes the windows smaller, which makes the program run faster, but increases the chance that you move a tracker outside its window, causing it to lose tracking.

The window is visualized with blue circles/boxes, based on the parameters. The tracker must be inside at least one window or it will not be tracked.

Ignore tracker 0:

This will cause tracker 0 not to be tracked. Use this if you want to replace the hip tracker with a Vive puck/owoTrack. Keep the number of trackers at 3.

Use previous position as guess:

This parameter sets whether, when estimating the 3D position of a detected tracker, the algorithm should use the previous position as a guess to help it. It should stay ticked unless you know what you are doing.

Use circular search window:

Search for trackers in a circular window around the previous known position or use vertical boxes. Since using circular windows is much faster, there is usually no reason not to use them.
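
The difference between the two window shapes boils down to a per-pixel membership test, roughly like this (a hypothetical Python illustration; these names are not from the codebase):

```python
def in_circular_window(x, y, cx, cy, radius):
    """Circular window: the point must be near the last known 2D position."""
    return (x - cx) ** 2 + (y - cy) ** 2 <= radius ** 2

def in_box_window(x, cx, half_width):
    """Vertical box: only horizontal distance matters; the full image height is searched."""
    return abs(x - cx) <= half_width

# A point far below the last position is still inside the vertical box,
# but outside the circle - the circle covers far fewer pixels, so it is faster.
assert in_box_window(100, cx=95, half_width=20)
assert not in_circular_window(100, 300, cx=95, cy=100, radius=20)
assert in_circular_window(100, 110, cx=95, cy=100, radius=20)
```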

Camera FPS:

The FPS of your camera. If you want to use a 60fps camera, set this to 60.

Camera width/height:

You can usually leave this at 0 and the program will automatically determine the correct width and height.

On some cameras, and usually with OBS, the camera will be opened with the wrong resolution and aspect ratio. In that case, replace these values with the correct ones.

Camera latency:

An experimental feature. In theory, this should tell SteamVR how old the positions we are sending are. It doesn't seem to do exactly that, but it still helps.

You can usually set this to 1, which seems to reduce perceived delay.

Open camera settings:

Experimental. Should open camera settings, but doesn't seem to work. You can try it, it may work for you.

Use chessboard calibration:

Use the old chessboard calibration. Switching to the new calibration is strongly recommended, but if you already have a chessboard and can't print the new pattern yet, you can check this to use the old system.

Known issues:

  • A tracker may face the wrong direction if only one marker is seen. This can be seen in the Beat Saber demo video.


Planned features:

  • Tutorial for reducing camera exposure on IP-Webcam
  • Virtual hip to enable use of leg trackers only


Olson, Edwin. "AprilTag: A robust and flexible visual fiducial system." 2011 IEEE International Conference on Robotics and Automation. IEEE, 2011.


WxWidgets: A Cross-Platform GUI Library


  • PreviewWindow destroyWindow hangs

    PreviewWindow destroyWindow hangs

    Finally figured out highgui. destroyWindow needs to be called on the thread that created the window. imshow creates a window, so either imshow needs to be called from the main thread, or we previously call namedWindow on the main thread and then allow imshow.

    opened by funnbot 4
  • Linux compatibility

    Linux compatibility

    Based on previous work: https://github.com/Ominitay/April-Tag-VR-FullBody-Tracker/tree/linux

    My changes address issues with getting camera started and crashes due to trying to use the UI from multiple threads (X server really doesn't like that).

    Communication with SteamVR is still TODO.

    opened by yoyobuae 3
  • Whole computer is lagging since I downloaded this exact thing

    Whole computer is lagging since I downloaded this exact thing

    Hi, I'm not used to github markdown so I don't know if It's the right way to submit issues, but since I downloaded the software, it started when using it as it lagged down my VR a lot, then it started lagging out other things such as discord or even my web browser, and even after deleting it the lag still goes on. I don't know if it's a know issue or just me ?

    opened by Lex0planet 2
  • All the trackers seem to be out of place

    All the trackers seem to be out of place

    Hey! I recently made a waist and leg trackers for AprilTag.


    Once i calibrated the cam (good calibration as far as i see) and the trackers i hopped onto SteamVR.

    Trackers seem to be recognized correctly (left leg is indeed a left leg, etc), but all of them seem to be way above me, like 1.5 to 2 meters above me.

    There should be a way to calibrate XYZ offsets for the trackers because of that, because i honestly have no idea what should i do about it. I tried repositioning the camera, restarting software, recalibrating everything (even room in SteamVR). No luck.

    Anyone knows a fix/workaround for that?

    opened by Evangeder 2
  • Refactor cmake, hopefully crossplatform now, build apriltag simultaneously.

    Refactor cmake, hopefully crossplatform now, build apriltag simultaneously.

    Pretty simple to build apriltag along with the other libs. It is a bit odd that the fork of apriltag changes the cmake project name to apriltags. It also apparently does not need the include_directories and the APRILTAG_INCLUDE_DIR variable, as target_link_libraries is able to include those automatically.

    I removed the checks for whether the variables were set as they should always be set by the root cmake file as long as the submodule is fetched.

    opened by funnbot 2
  • Show what camera is being used on titlebar [Request]

    Show what camera is being used on titlebar [Request]

    Add custom title or text in the app to show what camera is being used for multi camera setups

    Its kinda confusing to find which application belongs to which camera so it would be nice to have either a title showing what camera its using or text on the application showing what camera the app is using. I kinda like it custom bc i use 2 PSeye cameras and i could name them differently

    opened by LosWheatleynDew 2
  • Fisheye Cameras/Actioncams dont work correctly

    Fisheye Cameras/Actioncams dont work correctly

    When trying to use a actioncam, the trackers are in the wrong distance to the camera. The scaling of distance to the camera is broken, so if you are close to the camera, the trackers are near you, but if you are a bit away, the trackers are way behind you

    opened by Thrillerninja 2
  • Discord Nightly Webhook

    Discord Nightly Webhook

    Only for push events, sends a webhook with the commit messages and build status. If the build successfully compiles, it will link a download as well.

    • In discord, create read-only nightly channel then go to server settings > integrations > webhooks and create a webhook for it (use this logo if you want: https://github.githubassets.com/images/modules/logos_page/GitHub-Mark.png). Copy the webhook URL.

    • Go to this repo settings > secrets > actions > new repository secret. set name to DISCORD_NIGHTLY_WEBHOOK and value to the copied webhook url.

    • Go to https://nightly.link/ and 'Install and select your repositories', then follow the steps to add it to the ATT repo. This is for linking artifacts; the site has an explanation of why it's needed.

    opened by funnbot 1
  • not responding

    not responding

    I have been trying to set this up for a few days now and have uninstalled and reinstalled everything once but when I get to the step to preview the camera and set up the trackers the program stops responding and forces me to close it I don't know if this is a problem on my end and have been working to figure out if it is but would like to hear opinions and /or likely solutions. Thank you

    opened by ShadedHemlock 1
  • Fixes for Reflection and Serialization

    Fixes for Reflection and Serialization

    Better support for static polymorphism.
    Added a no-named field option that uses the counter as a unique id.
    FS::Valid cast should have been const.
    Some other fixes and formatting.

    opened by funnbot 1
  • Switch from clock() to std::chrono so time works right on Linux

    Switch from clock() to std::chrono so time works right on Linux

    I was having some weird timing-related issues on Linux with stuff like the FPS counter and cam preview refresh. It turns out they all use clock() which has a different meaning on Linux and other POSIX platforms than on Windows - it measures the amount of CPU time used by the current process, rather than elapsed wall-clock time like on Windows. Since the CPU was idle a lot of the time, the code thought a lot less time had passed than in reality and stuff broke. Using std::chrono::steady_clock from the C++ standard library seems to fix this and should hopefully work on all current platforms.

    opened by makomk 1
  • Allow For Filtering of Existing VMC Protocol, with added positional tracking.

    Allow For Filtering of Existing VMC Protocol, with added positional tracking.

    I was thinking that this software works amazingly well for something that requires line of sight. But what if we were to try to integrate the motion stream with slime trackers?

    The goal of this added feature would be to correct in yaw values of VMC protocol from a stream of information coming from the SlimeVR Server app. Slime trackers work extremely well, except for the gyroscopic drift that occurs over time. Mixing these two systems would allow us to post the QR codes on-top of the slime trackers themselves, which would allow for basically seamlessly interpolated as well as positionally tracked anchors without the cost of laser tracking.

    I would love to hear back on this!

    opened by Kyvarus 0
  • please provide installer.py not only the install_driver.exe in release pack

    please provide installer.py not only the install_driver.exe in release pack

    If there are any Chinese characters in steamvr.vrsettings, it raises an exception like this:


    Traceback (most recent call last):
      File "installer.py", line 60, in <module>
      File "json\__init__.py", line 293, in load
    UnicodeDecodeError: 'gbk' codec can't decode byte 0xac in position 1031: illegal multibyte sequence
    [13936] Failed to execute script installer

    The exception occurs on line 59 of installer.py:

    with open(config) as f:
        config_data = load(f)

    We need to change these lines as below to fix it:

    with open(config,encoding="utf-8") as f:
        config_data = load(f)
    opened by CrystalRays 1
  • [windows] OpenCV VideoCapture SEH exception with some backends

    [windows] OpenCV VideoCapture SEH exception with some backends

    From my testing, the obs-virtualcam dll that gets runtime-loaded can throw SEH exceptions which cannot be caught; considering how frequently people experience this, it's likely more backends can as well.

    The only way to catch SEH exceptions in C++ is to compile with the /EHa option, while /EHsc is the default for most C++ apps, as /EHa can introduce some performance and binary size issues.

    This leaves the alternative of spawning a separate process to test the camera address before the main process does. A small exe distributed in the utilities folder will be included in Windows builds; it only needs to link with OpenCV and will attempt to open the hardware index. Using CreateProcess, the main process will then read the exit code and notify the user that the camera can't be opened with this address.

    opened by funnbot 0
  • v0.7.1(Aug 28, 2022)

    An update to version 0.7 with a few bug fixes and a new marker library.

    • Fix to bug that caused playspace calibration to reset instead of save when reaching the 60 second timeout
    • Fixed folder structure of the Linux build to prevent too many intermediate folders
    • Added marker library to use with dodecahedron trackers

    Note that dodecahedron trackers are still fairly experimental, so you should only attempt making them if you are already familiar with ATT. With imperfect calibration, they will perform worse than regular trackers.

    If you still wish to experiment with them, the necessary files are pinned in the #dev-talk channel on the Discord, where most of the information about them is. Note that you will also have to change the "markers per tracker" parameter directly in config.yaml from 45 to 11.


    • Fix linux artifacts folder structure by @funnbot in https://github.com/ju1ce/April-Tag-VR-FullBody-Tracker/pull/85
    • Add tagCustom29h10 marker library by @yoyobuae in https://github.com/ju1ce/April-Tag-VR-FullBody-Tracker/pull/86
    • Fix calib timeout by @funnbot in https://github.com/ju1ce/April-Tag-VR-FullBody-Tracker/pull/84

    Full Changelog: https://github.com/ju1ce/April-Tag-VR-FullBody-Tracker/compare/v0.7.0...v0.7.1

    Source code(tar.gz)
    Source code(zip)
    AprilTagTrackers-Linux-v0.7.1.tar.zip(58.36 MB)
    AprilTagTrackers-Windows-v0.7.1.zip(14.43 MB)
  • v0.7.0(Aug 8, 2022)

    After a few weeks of delays due to bug fixes, the full version is finally released! This version features a number of QOL improvements, bug fixes, an improved GUI, and a Linux build!

    • The release now includes a Linux build
    • Added support for PS Eyes directly (using Camera API of 9100)
    • Added window naming for easier management of multiple camera setups
    • Changes to driver installation to make it easier to install
    • Smoothing can now be disabled completely by setting Smoothing window to 0
    • Fixed distance from camera resetting when calibrating playspace
    • Added option to mirror camera

    The format of the config files has changed, so recalibration of everything is necessary when upgrading. The driver is also new, so upgrade it as well!

    Known Issues:

    • On Linux, the driver installer does not enable multiple drivers in the SteamVR config, so it must be done manually.
    • The file structure of the Linux download contains too many unnecessary folders
    • When calibration mode times out after 60 seconds, the calibration gets reset instead of saved
    Source code(tar.gz)
    Source code(zip)
    AprilTagTrackers-Linux-v0.7.0.zip(58.36 MB)
    AprilTagTrackers-Windows-v0.7.0.zip(14.44 MB)
  • v0.7-beta-3(Jul 27, 2022)

    This beta includes some more fixes based on feedback from the previous one.

    • The set_exposure.bat script has been added back to the utilities folder, which should help those using IP Webcam set exposure more easily
    • Fixed the driver installer not enabling multiple drivers, causing the driver to not work
    • Some more fixes to the build system

    Known issues:

    • The fix to the driver installer has not been done for Linux yet, meaning it has to be done manually.
    Source code(tar.gz)
    Source code(zip)
    AprilTagTrackers-Linux-v0.7-beta-3.zip(58.36 MB)
    AprilTagTrackers-Windows-v0.7-beta-3.zip(14.44 MB)
  • v0.7-beta-2(Jul 15, 2022)

    This beta focuses on some more build changes, a fix to IP Webcam not working, and some fixes to the GStreamer backend on Linux.

    • The build system was changed to vcpkg
    • Using IP Webcam for the camera now works
    • The GStreamer backend on Linux should now work better

    Known issues:

    • The current driver installer does not enable activating multiple drivers in the SteamVR config. If this is your first time installing ATT, this may cause trackers to not appear inside SteamVR. This has been fixed in the latest nightly build, available on the Discord server.
    • You can also fix the above issue by manually enabling the setting; instructions can be found in the "Installing the driver" wiki page.
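    For reference, the SteamVR setting involved here is the activateMultipleDrivers flag in steamvr.vrsettings (found in Steam's config folder). A minimal sketch of the relevant section, assuming it is missing from your file:

```json
{
  "steamvr": {
    "activateMultipleDrivers": true
  }
}
```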
    Source code(tar.gz)
    Source code(zip)
    AprilTagTrackers-Linux-v0.7-beta-2.zip(58.02 MB)
    AprilTagTrackers-Windows-v0.7-beta-2.zip(14.10 MB)
  • v0.7-beta(Jul 1, 2022)

    This version focuses on project reformatting for easier building and development, Linux support, and ease-of-use changes:

    • The release now includes a Linux build
    • Added support for PS Eyes directly (using Camera API of 9100)
    • Added window naming for easier management of multiple camera setups
    • Changes to driver installation to make it easier to install
    • Smoothing can now be disabled by setting Smoothing window to 0
    • Fixed distance from camera resetting when calibrating playspace
    • Added option to mirror camera

    The format of the config files has changed, so recalibration of everything is necessary when upgrading.

    Full Changelog: https://github.com/ju1ce/April-Tag-VR-FullBody-Tracker/compare/v0.6...v0.7-beta

    Known issues:

    • Connecting to IP Webcam seems to be broken
    Source code(tar.gz)
    Source code(zip)
    AprilTagTrackers-Linux-v0.7-beta.zip(32.89 MB)
    AprilTagTrackers-Windows-v0.7-beta.zip(24.20 MB)
  • v0.6(Dec 2, 2021)

    After a few months of beta releases, ATT seems to be stable enough for a full release! This is exactly the same version as v0.5.5, but released as a full release instead of a pre-release.

    New additions, compared to the old v0.4 release:

    • New smoothing algorithm, which reduces shaking and delay
    • Support for multiple cameras
    • Easier camera calibration
    • Easier playspace calibration with controllers

    For more information on the changes, check the changelogs for the beta releases.

    The tutorial has been moved to the wiki with far more information, and a new setup video has been posted to Reddit, so don't forget to check them out!

    Source code(tar.gz)
    Source code(zip)
    ApriltagTrackers-v0.6.zip(93.03 MB)
  • v0.5.5-beta(Oct 17, 2021)

    This beta focuses mostly on a Chinese translation and adding licenses to the program.

    • Thanks to @apriltagtrackers-cn, the program has been translated into Chinese! The language can be selected in the params.
    • A tab has been added for all the licenses that must legally ship with the program.
    • Added an option to disable the out window in order to save resources and make high-FPS cameras work better.
    • During playspace calibration, the distance now also saves rather than resetting every time.
    • Fixed a bug that caused a crash if the preview window was opened during camera calibration
    • Fixed a bug where trackers would sometimes disappear and fail to reappear, even though the axes are seen in the out window
    Source code(tar.gz)
    Source code(zip)
    ApriltagTrackers-v0.5.5.Beta.zip(93.03 MB)
  • v0.5.4-beta(Sep 9, 2021)

    This beta adds some new features, bugfixes, and a slight UI update.

    • Added an option to use Aruco 4x4 markers
    • Added a depth smoothing option
    • Added additional smoothing to the driver
    • Parameters were grouped better and some were renamed to better represent what they are
    • Fixed a bug where calibration with controllers didn't work correctly on lighthouse systems
    • Fixed a bug where the camera would fly off during multiple camera calibration refinement
    Source code(tar.gz)
    Source code(zip)
    ApriltagTrackers-v0.5.4.Beta.zip(93.02 MB)
  • v0.5.3-beta(Aug 25, 2021)

    New beta release, containing some more features to make tracking smoother! This should finalize the multiple cameras update and allow you to calibrate their playspaces perfectly.

    • Added an option to automatically refine one playspace calibration to another. A rough calibration should still be done manually!
    • Added an option to lock camera height during calibration. Should help you recalibrate quicker if your playspace changes on Quest.
    • Added an option to spawn trackers in the middle of markers, rather than at the main marker.
    • Added some warnings to parameters that may break detection
    • Calibration mode will exit automatically if no button is pressed for 60 seconds
    Source code(tar.gz)
    Source code(zip)
    ApriltagTrackers-v0.5.3.Beta.zip(93.16 MB)
  • v0.5.2.1-beta(Aug 7, 2021)

  • v0.5.2-beta(Aug 6, 2021)

    New beta release, consisting of some more calibration QOL fixes and the full multiple camera implementation!

    • Closing SteamVR will now no longer close ATT (if a camera is opened)
    • ATT instances will now help each other through the driver, making multiple cameras run smoother
    • Camera calibration will now discard any frames with a large reprojection error
    • During playspace calibration, you can now calibrate distance from the camera along with rotation
    • The additional smoothing parameter no longer represents the maximum number of saved values, but rather their maximum time.
    Source code(tar.gz)
    Source code(zip)
    ApriltagTrackers-v0.5.2.Beta.zip(93.21 MB)
  • v0.5.1-beta(Jul 14, 2021)

    New beta release, mostly consisting of some calibration QOL changes and a fix to a driver bug.

    • Camera FPS and resolution are now shown on the camera preview
    • Warnings if camera calibration seems to be bad
    • Can now move and rotate camera with your left controller during playspace calibration (trigger to move, grip to rotate)
    • Trackers should no longer cause controllers to stop working
    Source code(tar.gz)
    Source code(zip)
    ApriltagTrackers-v0.5.1.Beta.zip(93.14 MB)
  • v0.5-beta(Jul 9, 2021)

    This is a beta release of the new linear interpolation smoothing, which also allows the use of multiple cameras. It also needs a driver reinstall, which is provided.

    • "Additional smoothing" parameter now defines number of values to use for linear interpolation. 10 is a good starting point.
    • "Camera latency" should now work as expected and noticably reduces delay.
    • Added some options for a virtual hip

    For additional information, check the discord!

    Source code(tar.gz)
    Source code(zip)
    ApriltagTrackers-v0.5-Beta.zip(92.78 MB)
  • v0.4(Apr 2, 2021)

  • v0.3.1(Feb 18, 2021)

    Some bug fixes

    • Fixed the bug where the camera had huge exposure and low FPS
    • Moved some unneeded parameters out of the GUI (they are still accessible through the params.yaml file)
    • Added some hidden test parameters, only accessible from the params.yaml file:
      • trackerCalibDistance: increase the maximum distance at which trackers will calibrate
      • cameraCalibSamples: increase the number of pictures you take when calibrating the camera
      • circularMarkers: use the TagCircle21h7 family of tags instead of the default Standard ones
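    Put together, the hidden entries might look like this in params.yaml; only the key names come from the list above, the values shown are placeholders:

```yaml
# params.yaml - hidden test parameters (values are placeholders)
trackerCalibDistance: 0.5   # max distance (meters) at which trackers calibrate
cameraCalibSamples: 15      # pictures taken during camera calibration
circularMarkers: false      # true = TagCircle21h7 tags instead of Standard
```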
    Source code(tar.gz)
    Source code(zip)
    ApriltagTrackers-v0.3.1.zip(33.48 MB)
  • v0.3(Feb 14, 2021)

    • Driver has been reworked and will require a reinstall
    • New playspace calibration system that should be simpler to use and more accurate
    • New camera calibration system that should be more accurate
    • Trackers must now be closer than 30cm to the camera to calibrate
    • Added more help windows and tooltips for parameters
    • Using SteamVR interpolation, which should improve smoothness and latency
    • Made some error messages more helpful
    Source code(tar.gz)
    Source code(zip)
    ApriltagTrackers-v0.3.zip(33.48 MB)
  • v0.2(Dec 18, 2020)

  • v0.1(Nov 13, 2020)
