Announcing…Pixy2!

Hi folks, we are so excited to announce the release of Pixy2! That’s right, your favorite vision sensor has been completely upgraded, hardware and firmware alike.

Here’s a quick video to catch you up to speed:

New features include:

  • Smaller, faster and lower cost
  • Easy line tracking with new line-following algorithms
  • Built-in light source (dual LEDs) for illuminating what Pixy is looking at
  • Redesigned Pan/Tilt unit
  • Available now at distributors around the world!

Click here to purchase Pixy2 today!

4 awesome videos from Pixy robot builders

What have people been making with Pixy? Lots of cool stuff, apparently! Every week we scour YouTube for fresh videos from Pixy users around the world. We add the most popular ones to our YouTube playlist called “What Can You Make With A Pixy?”

Read on for a few of our favorites…

Torbjorn’s Turret – a real-life recreation of an auto-tracking cannon from the game “Overwatch.” Burly motors and a manual mode, complete with laser-assisted first-person video targeting(!) make this a badass build!

Pixy for LEGO Mindstorms

I wanted to share some exciting news.  We sent out a survey a while back, and we received an incredibly enthusiastic response for adding LEGO Mindstorms support for Pixy.  So, we’ve been working hard since then, and we’re happy to announce that it’s ready!

We have a new version of Pixy that works with LEGO Mindstorms EV3 and NXT!

It’s available now at Robotshop and Amazon.com.

Thanks so much for your support! Check out the video and product details below.

 

Robot vision made easy

Pixy is a fast vision sensor for DIY robotics and similar applications. You can teach Pixy an object just by pressing a button, and then use it in all kinds of projects. It’s capable of tracking hundreds of objects simultaneously, and it only provides the data you are interested in.
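If you’re using Pixy with an Arduino (over SPI) rather than a LEGO brick, a minimal sketch like the one below gives a feel for what “only provides the data you are interested in” means in practice. It assumes the standard Pixy Arduino library (Pixy.h); the loop just asks Pixy for its current list of detected blocks:

    #include <SPI.h>
    #include <Pixy.h>

    // Minimal sketch (standard Arduino library over SPI, not the LEGO interface):
    // print whatever objects Pixy currently detects.
    Pixy pixy;

    void setup()
    {
      Serial.begin(9600);
      pixy.init();                        // start talking to Pixy
    }

    void loop()
    {
      uint16_t count = pixy.getBlocks();  // number of detected objects this frame

      // Pixy only sends blocks for objects it has been taught, so count is
      // zero unless one of your taught objects is actually in view.
      for (uint16_t i = 0; i < count; i++)
      {
        Serial.print("sig ");
        Serial.print(pixy.blocks[i].signature);
        Serial.print(" at (");
        Serial.print(pixy.blocks[i].x);
        Serial.print(", ");
        Serial.print(pixy.blocks[i].y);
        Serial.println(")");
      }
      delay(50);
    }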

Connect directly to LEGO Mindstorms EV3 and NXT

Pixy LEGO uses custom firmware that speaks the LEGO sensor protocol.  A special cable is provided that plugs directly into your LEGO brick.  You’ll be up and running fast!

Seamlessly integrates into the LEGO Mindstorms EV3 Environment

Pixy becomes part of the LEGO visual programming environment.  We provide software blocks for Pixy that make writing LEGO programs as easy as drag and drop!

Want to dig deeper? Pixy is entirely open-source (hardware, firmware and software). We offer a helpful wiki, and you can always ask a question on our forums.

Available through Robotshop and Amazon.com


Pixy CMUcam5 is a collaboration between Charmed Labs and Carnegie Mellon University.
LEGO Mindstorms EV3 and NXT are trademarks of the LEGO Group.  Neither Charmed Labs nor Carnegie Mellon University is associated with the LEGO Group.  

We made it to the Ben Heck Show!

This is a big day for Pixy. Our gallant vision sensor has been featured as part of a sweet automated video camera build on the Ben Heck Show! In the 2-part video below, he shows how he manufactures his own pulley and pan/tilt system, along with a custom Pixy mount that attaches to the hot shoe of the video camera. We’re stoked to be featured on the show! Dig the episode:

New firmware and software release for Pixy!

Hi Everybody,

It’s been a while, but we have a major new firmware and software release for Pixy (Version 2.0). Here’s what’s in it:

  • Improved detection accuracy
  • Simplified and effective color signature adjustments
  • Improved button-teach method
  • Improved implementation of color codes
  • New features added to the serial interfaces
  • Saving and loading of onboard Pixy parameters

You can download it here (new firmware included):

Here’s a video with the highlights of this release:

OK, that’s the big news. In smaller news, we now have a Python API for our Linux, Raspberry Pi and BeagleBone Black users. And we’ve added lots of new distributors, including ones in Canada, Australia, China, Japan, Korea, Singapore, Italy, and two in Germany. (If you know of an online retailer that you think would be a good fit for Pixy, please send us a note!)

But back to the new release…

Way back in July we sent out a survey asking what you wanted us to work on. A significant percentage of you expressed a desire for improved detection accuracy, so that’s just what we did — we paused what we were working on, refocused our efforts like little Pixys tracking lasers, and rewrote large sections of Pixy’s firmware with the goal of improving detection accuracy. It took us a long time, but we’re happy with the results.

Improved detection accuracy — a weakness of previous firmware versions was false positives, where Pixy detected things that you didn’t intend. Pixy’s new firmware is more robust, using a more accurate color filtering algorithm. (The new color filtering algorithm is more computationally expensive too, but we spent lots of time optimizing, and it runs at 50 frames per second like before — yay!)

Simplified and effective color signature adjustments — another problem with previous firmware versions was that it was somewhat difficult to “tweak” things if Pixy didn’t reliably detect the object you taught it. There were minimum saturation, hue range and a couple of other parameters that you could adjust, but it was unintuitive, and you always needed to re-teach after tweaking things. The new firmware uses a simple slider for each color signature — slide it to the left and your color signature will be less inclusive, slide it to the right and your color signature will be more inclusive — everything is adjusted on-the-fly. It’s super easy… and kinda fun to be honest, but we’re sorta weird in that way.

Improved button-teach method — you’ve always been able to teach Pixy an object by pressing the button and holding your object in front of Pixy. This feature had room for improvement though. Sometimes Pixy would complain that the object’s hue wasn’t saturated enough. Sometimes it would learn your object, but detection accuracy would be an issue. The new firmware can learn objects with a huge range of color saturation levels. And when it learns an object, the detection accuracy is greatly improved.

Improved implementation of color codes — you may have noticed that our color code implementation never made it out of beta status. That’s because we simply weren’t happy with it. In this firmware version color codes are much improved — more accurate and easier to use.
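For Arduino users, here’s a rough idea of how color codes show up in code. This is a hedged sketch based on our reading of the standard Pixy Arduino library: a color code built from, say, taught signatures 1 and 2 is reported as a single block whose signature combines the members in octal (012 octal, i.e. 10 decimal). Color-code blocks also report a rotation angle, but we’ll keep the sketch to the basic block fields:

    #include <SPI.h>
    #include <Pixy.h>

    // Hedged sketch: watch for a color code made from taught signatures 1 and 2.
    // Color-code blocks report a combined signature in octal (012 == 10 decimal).
    Pixy pixy;

    void setup()
    {
      Serial.begin(9600);
      pixy.init();
    }

    void loop()
    {
      uint16_t count = pixy.getBlocks();
      for (uint16_t i = 0; i < count; i++)
      {
        if (pixy.blocks[i].signature == 012)   // octal literal: signatures 1 and 2
        {
          Serial.print("color code 12 at (");
          Serial.print(pixy.blocks[i].x);
          Serial.print(", ");
          Serial.print(pixy.blocks[i].y);
          Serial.println(")");
        }
      }
      delay(50);
    }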

New features added to the serial interfaces — many of you wanted to be able to control Pixy’s camera brightness/exposure as well as Pixy’s RGB LED from an Arduino. These controls have been added and are in the Arduino API. And we’ve added pan/tilt servo control to the UART and I2C interfaces in addition to the camera brightness and LED controls. We’ve also added “SPI with slave select” as an interface option.
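Here’s a rough Arduino sketch of what those new controls look like. The function names below reflect our understanding of the Arduino API that ships with this release (check the API docs if your version differs), and the specific values are just illustrative placeholders:

    #include <SPI.h>
    #include <Pixy.h>

    // Rough sketch of the new Arduino-side controls; the values here are
    // placeholders chosen only to show the calls.
    Pixy pixy;

    void setup()
    {
      pixy.init();

      pixy.setBrightness(120);       // camera brightness/exposure (0-255)
      pixy.setLED(0, 255, 0);        // RGB LED, here set to green
    }

    void loop()
    {
      // Sweep the pan/tilt servos from one end to the other; positions run
      // from 0 to roughly 1000 on Pixy's pan/tilt unit.
      for (uint16_t pos = 0; pos <= 1000; pos += 10)
      {
        pixy.setServos(pos, pos);    // (pan, tilt)
        delay(20);
      }
    }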

Saving and loading of onboard Pixy parameters — you can save Pixy’s parameters, including color signatures, on your computer and restore them to your Pixy or even copy them to another Pixy. This was in the previous beta release, but it’s also been improved.

More developments coming!

Many of you have asked us what’s the status of the GCC-compatible version of the firmware (what we’re calling the “Firmware SDK”). It’s next on our list. Much of the work is already done, so we’re hoping it will be released soon. And we’re going to release a face detection algorithm after that. These projects have been piled up behind this release and have been running far behind schedule, so we’ll be glad to move on to these next tasks and get them moving toward the door.

We can always use help — if you’re a developer of any sort and want to help with the CMUcam5 Pixy project, please send us a note!

Thanks!

The Pixy Team

New Kickstarter that uses Pixy

IR-LOCK Kickstarter

Our friends at IR-LOCK launched their Kickstarter yesterday and it’s off to a great start — grabbing Staff Pick status! IR-LOCK does something lots of our customers have requested — robust beacon tracking. IR beacons can be tracked reliably under practically all conditions, even in the dark. We see some really cool applications for quad-rotors/drones, and we look forward to seeing what people do with it. We’re also looking forward to working with the IR-LOCK folks! Congrats guys!