Just a quickie update this New Year's Eve!
Our contract manufacturer shipped the first 2000 Pixys, and here they are, waiting for the lens attachment, the final firmware upload (when the final firmware is ready!) and packaging. More are on their way!
As far as software goes, we’ve been focusing on readying the image processing algorithms. One thing worth noting is that we’ve upped the performance of the color blob algorithm by reducing its latency (the amount of time it takes for Pixy to sense something).
The latency was pretty amazing at about 1.5 frames or 30 ms, but now it’s as little as 0.5 frames or 10 ms. This means, among other things, that tracking objects is really dang fast.
Clarifying note— the update rate of Pixy is 50 frames per second (50 Hz), which is also the update rate of the image sensor. Latency can be independent of the update rate, though. So for example, latency can be several frames without affecting the update rate. USB webcams typically have a latency of about 100 ms for video frames. (Update rates for webcams are typically 30 Hz or 33 ms.) Add image processing and the latency is even suckier! (that’s for webcams— Pixy’s latency, which includes image processing, can be as little as 10 ms! dang!)
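As a sanity check on the numbers above, a latency measured in frames converts to milliseconds by multiplying by the frame period. The helper function below is just for illustration (the name is ours, not part of any Pixy API):

```python
def latency_ms(latency_frames, update_rate_hz):
    """Convert a latency measured in frames to milliseconds."""
    frame_period_ms = 1000.0 / update_rate_hz
    return latency_frames * frame_period_ms

# Pixy updates at 50 Hz, so one frame period is 20 ms:
print(latency_ms(1.5, 50))  # → 30.0 (the old latency)
print(latency_ms(0.5, 50))  # → 10.0 (the new latency)

# A 30 Hz webcam with roughly 3 frames of buffering sits near 100 ms:
print(round(latency_ms(3, 30)))  # → 100
```

This also makes the independence point concrete: the update rate fixes the frame period, but the latency is however many frame periods (or fractions of one) sit between the photons hitting the sensor and the result coming out.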
OK, enough of this latency and update crap! Let’s see a demo!
The USB cable is for power only— no batteries needed! But you could use batteries if you wanted to, by plugging them into the external power connector. (Pixy has a built-in regulator, so you can plug in a battery, 6.0V to 12V.)
The mechanism in the demo is basically the one we’re shipping, but with a few tweaks to improve it that extra little bit.
We’re changing the estimated ship date. There….. I said it.
We’re pushing it back, for the last time. You may feel inclined to go punch a kitten— or us… but please don’t…. As we get closer, the list of tasks gets smaller and the remaining tasks are easier to estimate, and so we’re getting a much clearer picture of where we are schedule-wise. Here’s what we’re seeing—
The hardware isn’t driving the schedule like we’d thought — software and firmware are determining the ship date. We have spent much of the last week shedding light on the remaining tasks and putting them into a revised schedule.
Here’s how it breaks down— we will reach code/feature completion the week of 1/27. We will then have a testing period and prepare the final release, which will push the ship date to the week of 2/17.
Pushing the ship date back is a big deal, we realize, but we want to give you the best information we have, even if it isn’t fun. And as much as we’d like to ship sooner, it would mean cutting corners and risking product quality. And that’s an even bigger no-no.
So when we start shipping the week of 2/17, it will take 3 weeks to ship all of the rewards. Some backers, particularly those with the higher backer numbers, won’t receive their Pixys until well into March. We will lay out the shipping schedule (based on your backing number) closer to the ship date. And pre-orders (non-Kickstarter) will ship after all of the Kickstarter rewards ship, in the order received (of course). OK, that concludes the hard-truth portion of the update. (But I suspect the real pain will come at us in the comments!) Anyway, onto hardware—
Our contract manufacturer is cranking out Pixys full-tilt, and will continue to do so for several weeks. Here are some pictures they sent us yesterday.
In that last picture, the functional test, the last I’d heard they’d tested 600 with zero defects— crazy! We will receive our first shipment of 2000 before the end of the year. We’ll do the final testing, programming, lens attachment and packaging here in beautiful Austin. Speaking of packaging, we’re having custom boxes printed up for our little guy, with color codes printed right on the box! You know, as part of the out-of-the box experience.
The custom pan/tilt mechanism design has been successfully prototyped and parts have been ordered. Check out the pictures below—
The mechanism will come as a kit. It’s easy to assemble, and it’s compact (as you can see in the pics). It sits comfortably on a flat surface, but you can take the base off and replace it with your robot, or whatever else you want to mount it to, to give your creation the ability to look around and find/track even more objects. It’s mechanically fast— faster than the pan/tilt mechanism in the Kickstarter video. (BTW, you can still pre-order a pan/tilt here.)
The mounting brackets that attach Pixy to the pan/tilt mechanism will come with your Pixy and allow for fixed mounting options, if you didn’t opt for a pan/tilt.
So hardware is on track. Software is going well, but slower than we’d all like. When we changed the ship date the first time (this is the 2nd and final time) someone commented that it looks like we don’t know what we’re doing. They’re right! (well… sorta.) It’s our first Kickstarter, our first design with these parts, our first camera with these capabilities and this price point.
This process, while not going exactly as planned, has been awesome— it’s given us an opportunity to demonstrate our idea to you, and for you to tell us whether you like it, what you like in particular, what you don’t like, and what you’d like to see more of. It’s changed Pixy. We will ship something better than we originally envisioned. That’s on you. That’s your fault! (but thank you!!!)
And we’re truly sorry this is taking longer than expected….
The next update in a couple weeks— including video!
First of all, thanks for your patience. We’re listening to you guys, and we’re hearing some frustration about the lack of updates. Our apologies for this— we’ll try to provide more updates and more detail going forward. We’ve been super busy, of course — but things are going well.
A quick thumbnail sketch before going into detail — assembled and tested hardware will start arriving in Austin after Christmas — a little ahead of schedule (yay!) Software is going well too. We’re a little behind, but working hard to get back on schedule. Read on for more detail!
We received the “first articles” from the contract manufacturer last week.
Some of the changes to the PCB involved moving the RGB LED farther from the lens so it’s easier to see (it’s at the bottom of the PCB now). We also added more mounting holes and gave it a “neck” to facilitate a tilt mount. And we made various tweaks to improve manufacturing. BTW, the final schematic is here for those interested.
We’ve been testing these guys, making sure everything works, etc. In short, they’re working great! And we’ve given our contract manufacturer the official “go”— i.e., go make thousands! Then send them to us! We wrote a really nice automated test program that they can use to test all features. They plug in a USB cable and it gives them a green light (pass) or red light (fail), so they can detect defects early and remedy them on their end (in India). The finished, tested hardware will start arriving here in Austin (in the thousands) after Christmas.
The lenses arrived yesterday from Asia!
Oh— and the pan/tilt mechanism— if you remember, we did a full custom design for the pan/tilt mechanism to get the best performance and quality. We received prototype laser-cut parts last week, and we’re going out for another revision. We’ve also received several samples of servos and have evaluated them for their speed and torque. Gonna hold off on the pictures for now…. but demos soon.
When we were thinking about what we wanted Pixy to do way back when, we laid out a list of basic things that needed to be in place. We didn’t want Pixy to be just a super-fast color blob tracker (which is cool) — we wanted to be able to add new stuff (new functionality, algorithms, etc.) without breaking existing stuff, have things run in limited memory, make debugging easy, blah, blah. Here’s the list— straight from our notes (but formatted to look prettier):
USB communications stack: fast, low code/execution overhead, plug/unplug robust, portable across Windows, MacOS, Linux (including driver support).
Host communication framework: ability to plug a USB cable into a live system and receive debug messages and render imaging objects (and then unplug, resume program). Ability to query and configure parameters (white balance, brightness, color signatures, etc) through host tool (PixyMon). Ability to extend functionality by adding new interfaces and parameters (for example, it should be easy to add an XYZ detector module with its configuration and processing procedures, you know, to extend Pixy’s capabilities.)
Nonvolatile parameter support: provide a common interface for creating, deleting, loading and saving nonvolatile parameters (on the Pixy side). And provide the ability to list/query/index parameters through the host communication framework so we can display and modify them through PixyMon.
Simple data communication support: ability to stream processed information through either SPI, I2C, UART serial, GPIO or analog out. In particular, be able to talk to Arduino through ICSP connector, which has no slave select signal.
Firmware upload: ability to upload new firmware. Can’t rely on existing firmware state, existing bootloader, etc. Be able to do this easily through PixyMon — File->Load Firmware, or similar (no 3rd party tools).
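To make the nonvolatile parameter idea above a little more concrete, here’s a toy sketch of the kind of interface described — create, set, list, save, and load named parameters that a host tool could then enumerate and edit. This is illustrative only: Pixy’s actual firmware is written in C, and the class and method names here are made up for the sketch.

```python
import json

class ParamStore:
    """Toy sketch of a nonvolatile parameter store. A host tool
    (e.g. PixyMon) could list(), get(), and set() parameters, while
    save()/load() stand in for flash persistence on the device."""

    def __init__(self):
        self._params = {}

    def create(self, name, default):
        # register a parameter with its default value (idempotent)
        self._params.setdefault(name, default)

    def set(self, name, value):
        if name not in self._params:
            raise KeyError("unknown parameter: %s" % name)
        self._params[name] = value

    def get(self, name):
        return self._params[name]

    def list(self):
        # lets a host tool enumerate all parameters by name
        return sorted(self._params)

    def save(self, path):
        # stand-in for writing parameters to nonvolatile memory
        with open(path, "w") as f:
            json.dump(self._params, f)

    def load(self, path):
        # stand-in for reading parameters back at boot
        with open(path) as f:
            self._params.update(json.load(f))
```

For example, a host tool could do `store.create("brightness", 128)`, tweak it with `store.set("brightness", 200)`, and persist the result with `store.save(...)` — the point of the common interface being that an “XYZ detector module” can register its own parameters the same way.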
This is the “low-level stuff”— and we’ve been focusing on it for pretty much all of November. Happily, we’re pretty much done, and it all works great, which is good, because a shaky implementation would lead to problems down the road. But it took longer than we had expected.
Some of it we had to implement from scratch because the processor is new. Some of it we wanted to get close to perfect instead of good enough. And some of it we don’t have yet, like full Linux support— yet— but we’ve been careful to make the right decisions and not program ourselves into a corner, so to speak.
With these things taken care of, we’re ready to start on more of the fun/interesting stuff— like getting the color recognition software/algorithms cleaned up and readied. So that will be our focus until we ship— the software that actually does the detection and tracking— the higher-level stuff. This is much more fun to talk about and show off, and it makes for better updates.
We have a ship date of January 17 as our goal — looking at the list of tasks, it will be tight. It’s a little early to tell, but in 2 weeks we’ll have a much better idea, and we’ll provide an update then, which details the tasks and schedule all the way until the ship date. In the meantime, we’ll be here, fightin the good fight!
Thank you again for your support and your patience!
Our little Pixy, with flaxen hair and bookish good looks, set out on Kickstarter 30 days ago with a $25,000 goal and a crazy idea: to bring fast, easy-to-use machine vision to robots everywhere. He (she?) passed 1000% funding before ending an awesome Kickstarter campaign. We find this hard to believe!! Thank you to everyone who contributed!
And if you missed our Kickstarter, no worries! You can pre-order a Pixy here. Thanks!
So, this is the final stretch— only a few days left before our Kickstarter ends. Heading into the final stretch, we created a new video, a small video, a cat video! It shows Pixy playing with some lasers, and potentially bugging some cats, although no actual cats were annoyed in the video. Take a look!
It’s been a great week for Pixy! We’re past our funding goal, and we’re generating some real excitement and good press. I just came across this in the Austin Post — we’re Austin Kickstarter of the Week! Austin is a small city by some standards, but we’re a pretty big Kickstarter city.
If you want to film an awesome Kickstarter video, Jorge Sanhueza-Lyon is your person. Contact him via Facebook or send me a note if you want more information. There is absolutely no doubt that we’re here talking about how great Pixy is doing because of his vision.
Here are some other places on the interwebs to read about our little Pixy:
We’re super excited — this is our first Kickstarter! Check out the Kickstarter page here — it has all of the information about Pixy. And thanks for visiting our little corner of the web.
Here’s some backstory on how Pixy got started. About two years ago, a friend sent me an email describing how he missed using the Xport Botball Controller, particularly its color vision system. We had designed the XBC for Botball.org, for use in STEM education. It used a Gameboy Advance as the main processor and employed an FPGA to provide the required I/O that was lacking on the Gameboy. With this system we were able to create a capable color vision system with a decent embedded GUI. You could give it 3 “model colors” that you were interested in, and it would find objects of those colors in the image and report the results back. It could provide up to 25 frames per second of processed image data, which was impressive considering that the Gameboy only had a 16 MHz processor (the Gameboy was released in 2001).

It got me thinking — what could be accomplished with the latest generation of embedded processors? NXP had just announced a new processor with 2 ARM cores, both running at 200 MHz. The pricing was compelling. One core could handle the front end — frame acquisition and low-level processing — and the other core could handle higher-level image processing — connected components and communication. It would be a simple design with just a processor and an imaging chip. Simple designs are compelling — easy to manufacture and maintain, and lower cost as well.

I contacted Anthony Rowe at CMU (the inventor of the CMUcam, who now runs his own lab) with the idea, and it turns out he had just been discussing a CMUcam redesign that morning with a colleague. And so began the CMUcam5…
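The blob-finding step described above — grouping pixels that match a model color into connected components — can be sketched in a few lines. This is a toy illustration, not the XBC or Pixy code; it assumes the per-pixel color classification has already produced a 0/1 mask:

```python
def find_blobs(mask):
    """Group truthy pixels in a 2D grid into 4-connected blobs;
    returns a list of blobs, each a set of (row, col) coordinates."""
    rows, cols = len(mask), len(mask[0])
    seen = [[False] * cols for _ in range(rows)]
    blobs = []
    for r in range(rows):
        for c in range(cols):
            if mask[r][c] and not seen[r][c]:
                # flood-fill one connected component from this pixel
                stack, blob = [(r, c)], set()
                seen[r][c] = True
                while stack:
                    y, x = stack.pop()
                    blob.add((y, x))
                    for ny, nx in ((y - 1, x), (y + 1, x),
                                   (y, x - 1), (y, x + 1)):
                        if (0 <= ny < rows and 0 <= nx < cols
                                and mask[ny][nx] and not seen[ny][nx]):
                            seen[ny][nx] = True
                            stack.append((ny, nx))
                blobs.append(blob)
    return blobs

# 1 = pixel matched the model color, 0 = no match
mask = [
    [1, 1, 0, 0],
    [0, 1, 0, 1],
    [0, 0, 0, 1],
]
print(len(find_blobs(mask)))  # → 2 blobs
```

A real embedded implementation would work on run-length-encoded scanlines rather than random access into a full frame buffer (memory is tight on a 16 MHz Gameboy or a small ARM), but the grouping idea is the same.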