A programmable and highly maneuverable robotic cat for STEM education and AI-enhanced services.

Overview

OpenCat

OpenCat is the open-source Arduino and Raspberry Pi-based robotic pet framework developed by Petoi, the maker of futuristic programmable robotic pets.

The goal is to foster collaboration in quadruped robotics research, develop agile and affordable quadruped robot pets, bring STEM concepts to the masses, and inspire newcomers (including many kids) to join the robotics and AI revolution.

The project is still a complex system intended for skilled makers, but we want to share our work with the community by mass-producing the hardware and bringing down both hardware and software costs.

OpenCat has been deployed on Petoi's palm-sized, lifelike robot cat Nybble (www.igg.me/at/nybble) and robot dog Bittle (www.igg.me/at/bittle), both of which can run, walk, and self-balance like real animals. We successfully crowdfunded these two robots and brought them to market. With our customized Arduino board coordinating all instinctive and sophisticated movements, you can clip on various sensors to bring in perception and inject artificial intelligence capabilities by mounting a Raspberry Pi or other AI chip through wired/wireless connections.

This repository works on both Nybble and Bittle, controlled by the NyBoard based on the ATmega328P. To run the code on our robot models, first change the model and board definitions in OpenCat.h, then upload WriteInstinct.ino:

#include "InstinctBittle.h" //activate the correct header file according to your model
//#include "InstinctNybble.h"

//#define NyBoard_V0_1
//#define NyBoard_V0_2
#define NyBoard_V1_0

In the serial monitor, set No line ending and the baud rate to 115200 (or 57600 for NyBoard V0_*). Enter a capital Y after each of the three prompts and wait for the MPU to calibrate. Then upload OpenCat.ino as the main functional code.
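The same serial settings apply when talking to the board from a script instead of the Arduino serial monitor. A minimal host-side sketch of the framing, assuming pyserial on the host (the helper names here are illustrative, not part of the OpenCat codebase):

```python
# Host-side framing for NyBoard serial commands.
# The serial-monitor settings above (115200 baud, "No line ending")
# imply that command tokens are sent as bare ASCII with no trailing newline.

def frame_command(token: str) -> bytes:
    """Encode a command token (e.g. 'kbalance', 'krest') as raw ASCII bytes."""
    return token.encode("ascii")

def port_baud(board: str = "NyBoard_V1_0") -> int:
    """Baud rate per board generation: 57600 for NyBoard V0_*, else 115200."""
    return 57600 if board.startswith("NyBoard_V0") else 115200
```

With pyserial this would be used as `serial.Serial(port, port_baud()).write(frame_command("kbalance"))`.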

For updates:

  • Star this repository to receive timely notifications of changes.
  • Visit www.petoi.com and subscribe to our official newsletter for project announcements.
  • Follow us on Twitter, Instagram, and YouTube for fun videos and community activities.

The old OpenCat repository has become too bloated with large image logs and will be obsoleted once we add compatibility notes to the documentation.

Comments
  • Getting IMU data

    Hi @borntoleave,

    Is it possible to retrieve IMU data from Bittle programmatically? I want to get odometry for SLAM package integration.

    Thanks, Sergey

    opened by sskorol 21
  • OpenCat.ino Sketch too big to compile

    Hi,

I downloaded OpenCat.ino, but it does not compile. The error is:

    Sketch uses 33764 bytes (104%) of program storage space. Maximum is 32256 bytes. Global variables use 1344 bytes (65%) of dynamic memory, leaving 704 bytes for local variables. Maximum is 2048 bytes. Sketch too big; see http://www.arduino.cc/en/Guide/Troubleshooting#size for tips on reducing it.

    Updated libraries to the latest.

    Any idea?

    BR Heiko

    opened by loeweh 14
  • Bittle sometimes doesn't physically respond to Serial commands

    Robot: Bittle
    Hardware: NyBoard v1.0, RPi Zero 2 W
    Software: Ubuntu Server 20.04, Python 3.8

    Steps:

    1. Run ardSerial.py
    2. Send kbalance command
    3. Send krest command
    4. Send kbalance command
    5. Send krest command

    Expected: all the commands are successfully executed, and Bittle performs the requested actions. Actual: the last krest command is acknowledged, but Bittle doesn't perform any action.

    Check the video to see it in action.

    Note that it's not related to krest command only. I observed the same behavior for any command. Everything may work smoothly, and then in a moment, some command is just skipped.
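A host-side workaround for this kind of skipped command is to wait for some reply before sending the next token and retry when nothing comes back. A hedged sketch (the acknowledgement format is an assumption here; the real firmware reply may differ, and `write`/`read_line` stand in for a pyserial handle):

```python
import time

def send_reliably(write, read_line, token, retries=3, settle=0.5):
    """Send a command token and retry until some acknowledgement comes back.

    write/read_line abstract the serial handle. The idea: give the MCU time
    to act, then check for any reply bytes; resend if the command was
    silently dropped. The acknowledgement check is a hypothetical
    placeholder, not the documented OpenCat protocol.
    """
    for _ in range(retries):
        write(token.encode("ascii"))
        time.sleep(settle)      # let the MCU finish before the next token
        ack = read_line()
        if ack:                 # any non-empty reply counts as acknowledged
            return ack
    return None                 # command was dropped on every attempt
```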

    Also note that it seems like I'm not alone here. I investigated existing GitHub repos for Bittle and found this one for the gamepad teleop. There's a list of known issues at the end.


    The second one seems very close to what I'm talking about. And this repo uses pyBittle library as a dependency for Serial communication.

    Anyway, it seems like there's a bug, so it would be greatly appreciated if you could take a look @borntoleave @JasonWong08.

    Thanks, Sergey

    opened by sskorol 12
  • New serial code doesn't work with RPi hat

    Hi @borntoleave @JasonWong08,

    I noticed a recent update in serial code. Wondering if you have tested it with an RPi hat?

    It doesn't work for me. First of all, it incorrectly detects the serial port: /dev/ttyAMA0 is wrong for the RPi; it should be ttyS0.

    I updated it manually in code, but serial behavior still seems incorrect to me:

    • Some commands work, others don't. E.g. I can run ksit, but a subsequent kbalance doesn't work, or vice versa.
    • NyBoard always gets stuck after 1 or 2 commands and requires a reset.
    • Sometimes I see response-parsing exceptions for correct commands, e.g. here when we try to extract a 0-element from an empty object.
    • NyBoard is totally unstable: it may beep several times before or after a command's execution, but there's no response at all.

    Note that I can still control Bittle with IR until the board is stuck on serial.

    I can record a video, of course, but it's not really helpful when the board is stuck after one command and Bittle becomes unresponsive.

    Any thoughts or comments on how to make the serial interface stable with an RPi hat would be greatly appreciated.
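The port-detection half of the report can be handled host-side by preferring /dev/ttyS0, as the report suggests. A minimal sketch, assuming the candidate device names come from a port scan (e.g. pyserial's `serial.tools.list_ports`); the function name is illustrative:

```python
def pick_rpi_port(candidates):
    """Pick a serial device for an RPi hat from a list of device names.

    Per the report above, /dev/ttyS0 is the working UART on the RPi, so it
    is preferred over /dev/ttyAMA0 when both appear. Falls back to the
    first candidate, or None when nothing was found.
    """
    for preferred in ("/dev/ttyS0", "/dev/ttyAMA0"):
        if preferred in candidates:
            return preferred
    return candidates[0] if candidates else None
```

Note this is a sketch of the selection logic only; which UART is active depends on the Pi model and its /boot/config.txt overlay settings.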

    Hardware / Environment

    • Bittle / NyBoard_V1_0
    • RPi Zero 2W / model A+
    • x64 Ubuntu 20.04
    opened by sskorol 10
  • Nybble - Adding new skill: need to move zero from EEPROM to PROGMEM

    Not sure if this is a bug or not, but thought I'd mention and let you decide :)

    Last night I tried to write a custom zero skill for Nybble, but it didn't seem to work.

    The fix was easy: in my local branch, inside InstinctNybble.h:

    1. I moved zero to the progmemPointer[] array.
    2. Renamed it to "zeroN" in the skillNameWithType array.
    3. Re-ran WriteInstinct.ino to update the skills.
    4. Uploaded and ran my copy of OpenCat.ino.

    And to gain a bit more program memory, I commented out the hs, hs1, and hs2 skills.

    Now I can just tweak the zero skill and upload the code; there's no need to re-run WriteInstinct. But I needed it the first time just to move zero out of EEPROM and into program memory (if I understand correctly).

    opened by Troy-Chard 10
  • Application crashes when using macOS with M1

    I'm experiencing issues launching the Petoi Desktop App on a MacBook Pro with an M1 CPU.

    Launching from the Launchpad, I only got an error dialog (screenshot omitted).

    Launching from the terminal I got this traceback:

    Traceback (most recent call last):
      File "/Applications/Petoi Desktop App.app/Contents/Resources/__boot__.py", line 89, in _recipes_pil_prescript
        import Image
    ModuleNotFoundError: No module named 'Image'
    
    During handling of the above exception, another exception occurred:
    
    Traceback (most recent call last):
      File "/Applications/Petoi Desktop App.app/Contents/Resources/__boot__.py", line 135, in <module>
        _recipes_pil_prescript(['DdsImagePlugin', 'FitsStubImagePlugin', 'EpsImagePlugin', 'WmfImagePlugin', 'PdfImagePlugin', 'JpegImagePlugin', 'DcxImagePlugin', 'GbrImagePlugin', 'FpxImagePlugin', 'PalmImagePlugin', 'IptcImagePlugin', 'PsdImagePlugin', 'SunImagePlugin', 'MpegImagePlugin', 'IcoImagePlugin', 'BmpImagePlugin', 'IcnsImagePlugin', 'TgaImagePlugin', 'GifImagePlugin', 'FliImagePlugin', 'Jpeg2KImagePlugin', 'SgiImagePlugin', 'Hdf5StubImagePlugin', 'CurImagePlugin', 'PixarImagePlugin', 'BufrStubImagePlugin', 'XpmImagePlugin', 'MicImagePlugin', 'PngImagePlugin', 'BlpImagePlugin', 'WebPImagePlugin', 'PcdImagePlugin', 'FtexImagePlugin', 'MpoImagePlugin', 'MspImagePlugin', 'ImtImagePlugin', 'PpmImagePlugin', 'SpiderImagePlugin', 'PcxImagePlugin', 'GribStubImagePlugin', 'McIdasImagePlugin', 'XbmImagePlugin', 'ImImagePlugin', 'XVThumbImagePlugin', 'TiffImagePlugin'])
      File "/Applications/Petoi Desktop App.app/Contents/Resources/__boot__.py", line 93, in _recipes_pil_prescript
        from PIL import Image
      File "<frozen zipimport>", line 259, in load_module
      File "PIL/Image.pyc", line 89, in <module>
      File "<frozen zipimport>", line 259, in load_module
      File "PIL/_imaging.pyc", line 14, in <module>
      File "PIL/_imaging.pyc", line 10, in __load
      File "imp.pyc", line 342, in load_dynamic
    ImportError: dlopen(/Applications/Petoi Desktop App.app/Contents/Resources/lib/python3.9/lib-dynload/PIL/_imaging.so, 0x0002): Library not loaded: '@rpath/libtiff.5.dylib'
      Referenced from: '/Applications/Petoi Desktop App.app/Contents/Resources/lib/python3.9/lib-dynload/PIL/_imaging.so'
      Reason: tried: '/opt/anaconda3/envs/simpleUI/lib/libtiff.5.dylib' (no such file), '/opt/anaconda3/envs/simpleUI/lib/libtiff.5.dylib' (no such file), '/Applications/Petoi Desktop App.app/Contents/Resources/lib/python3.9/lib-dynload/PIL/../../../libtiff.5.dylib' (no such file), '/opt/anaconda3/envs/simpleUI/lib/libtiff.5.dylib' (no such file), '/opt/anaconda3/envs/simpleUI/lib/libtiff.5.dylib' (no such file), '/Applications/Petoi Desktop App.app/Contents/Resources/lib/python3.9/lib-dynload/PIL/../../../libtiff.5.dylib' (no such file), '/opt/anaconda3/envs/simpleUI/lib/libtiff.5.dylib' (no such file), '/Applications/Petoi Desktop App.app/Contents/Frameworks/libtiff.5.dylib' (no such file), '/Applications/Petoi Desktop App.app/Contents/MacOS/../Frameworks/libtiff.5.dylib' (no such file), '/usr/local/lib/libtiff.5.dylib' (no such file), '/usr/lib/libtiff.5.dylib' (no such file)
    2022-08-29 10:16:05.365 Petoi Desktop App[7205:38695] Launch error
    2022-08-29 10:16:05.365 Petoi Desktop App[7205:38695] Launch error
    See the py2app website for debugging launch issues
    exit
    

    I also tried running directly from source: I cloned this repo and started pyUI/UI.py after installing the pyserial and pillow packages. The application starts and shows the small window with the three buttons, but when I choose Skill Composer and click the Behavior or Export buttons, it crashes with a segmentation fault.

    Is there something I'm missing?

    Thanks in advance for the help

    opened by ioulosve 3
  • Is it possible to remote-control precise Bittle gaits?

    I've seen examples such as pyBittle of sending pre-programmed gaits to the Bittle via WiFi or Bluetooth. I'm an AI researcher looking to instead send precise velocities and torques to the Bittle for exact joint control. I also need to capture the Bittle's accelerometer and gyroscope data.

    Is there example code anywhere for how to achieve both of these goals?

    1. Send precise commands
    2. Receive Bittle metadata

    Thanks for any help!

    opened by slerman12 3
  • Fixes/Tidy ups for testUltrasonic

    The testUltrasonic.ino file was not compatible with the "Grove Ultrasonic Ranger" that I received in the Bittle Sensor Pack.

    The difference is that the "Grove Ultrasonic Ranger" uses only one pin for input/output.

    Changes:

    • Support single pin ultrasonic ranging
    • Tidy up of code
    • Use physics-based #defines to make it clear why we use particular constants (e.g. the speed of sound)

    Testing

    I used a super sophisticated test setup 😜


    Tested well at 15 cm, 10 cm, and 5 cm.

    opened by hoani 3
  • Please make a stable release

    @borntoleave Since this repo is updated so frequently, our code sometimes breaks when you push a new commit. It would be great if you could make a stable release.

    opened by ImFstAsFckBoi 2
  • Understanding the format of a behavior: trigger axis

    The bf behavior is defined as: https://github.com/PetoiCamp/OpenCat/blob/f1013cece692d6ff9d86577a7216416f9587aa2d/WriteInstinct/InstinctBittle.h#L599-L610

    If I understand correctly, the fifth frame is triggered by a pitch angle of -10 degrees. The sixth frame seems to have the trigger axis set to zero (i.e. no trigger axis) but there is an angle value. It seems that the 127 will never be read because the axis is tested first and you can't trigger on yaw: https://github.com/PetoiCamp/OpenCat/blob/f1013cece692d6ff9d86577a7216416f9587aa2d/OpenCat.ino#L740-L742

    Is this meant to be ignored, or an error, or do I misinterpret the format?

    opened by durka 2
  • Divide by zero

    File WriteInstinct/OpenCat.h, line 926:

    void calibratedPWM(byte i, float angle, float speedRatio = 0) {

    All active calls to calibratedPWM() pass only 2 arguments, so the default value of speedRatio (0) is used unguarded on line 924:

    byte steps = byte(round(abs(duty - duty0) / 1.0 /*degreeStep*/ / speedRatio)); //default speed is 1 degree per step

    triggering a divide by zero.
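The guard the report calls for is a one-line check before the division. A Python rendition of the logic for illustration (not the firmware's actual fix; the single-step fallback when speedRatio is zero is an assumption):

```python
def servo_steps(duty, duty0, degree_step=1.0, speed_ratio=0.0):
    """Number of interpolation steps between two servo duty values.

    Mirrors the firmware expression abs(duty - duty0) / degreeStep / speedRatio,
    but guards the speed_ratio == 0 default (the unguarded case reported above)
    by jumping to the target in a single step instead of dividing by zero.
    """
    if speed_ratio <= 0:
        return 1  # no interpolation: move in one jump
    return int(round(abs(duty - duty0) / degree_step / speed_ratio))
```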

    opened by DaveB1620 2
  • Running the Firmware Uploader on a Raspberry Pi does not work

    I tried to run the firmware uploader from the UI on a Raspberry Pi, but it isn't able to connect to the correct port (/dev/ttyS0). The calibrator, however, works on the Raspberry Pi. After looking at the code, we found that communication is established differently in the calibrator and the firmware uploader: the calibrator uses ardSerial, while the firmware uploader uses serial.tools.list_ports. Is there a way we can use ardSerial in the firmware uploader? Thanks for the help.

    opened by elgohary12 1
Releases: 1.0.1
Owner: Petoi LLC