So, this is the final stretch: only a few days left before our Kickstarter ends. To celebrate, we created a new video, a small video, a cat video! It shows Pixy playing with some lasers, and potentially bugging some cats, although no actual cats were annoyed in the video. Take a look!
Austin Kickstarter of the week
It’s been a great week for Pixy! We’re past our funding goal, and we’re generating some real excitement and good press. I just came across this in the Austin Post: we’re Austin Kickstarter of the Week! Austin may be a small city by some standards, but it’s a pretty big Kickstarter city.
If you want to film an awesome Kickstarter video, Jorge Sanhueza-Lyon is your person. Contact him via Facebook, or send me a note if you want more information. There’s no doubt that Pixy is doing as well as it is because of his vision.
Here are some other places on the interwebs to read about our little Pixy:
Techcrunch
Hack-a-day
Azosensors
PCWorld
Scoop.it!
Solid State Technology
Ubergizmo
Electronics Weekly
WSJ Marketwatch
Will you like Pixy?
We put up a Facebook page for our little guy. It’s another way to spread the word of Pixy to your friends. Last I checked, we were up to 3 likes!
Pixy goes live on Kickstarter!
We’re super excited: this is our first Kickstarter! Check out the Kickstarter page here; it has all of the information about Pixy. And thanks for visiting our little corner of the web.
Here’s some backstory on how Pixy got started. A friend sent me an email about two years ago describing how he missed using the Xport Botball Controller, particularly its color vision system. We had designed the XBC for Botball.org, for use in STEM education. It used a Gameboy Advance as the main processor and employed an FPGA to provide the I/O that the Gameboy lacked. With this system we were able to create a capable color vision system with a decent embedded GUI. You could give it 3 “model colors” you were interested in, and it would find objects of those colors in the image and report the results back. It could provide up to 25 frames per second of processed image data, which was impressive considering that the Gameboy only had a 16MHz processor (the Gameboy Advance was released in 2001).

It got me thinking: what could be accomplished with the latest generation of embedded processors? NXP had just announced a new processor with 2 ARM cores, both running at 200MHz, and the pricing was compelling. One core could handle the front end (frame acquisition and low-level processing) while the other handled higher-level image processing (connected components and communication). It would be a simple design with just a processor and an imaging chip. Simple designs are compelling: they’re easy to manufacture and maintain, and lower cost as well.

I contacted Anthony Rowe at CMU (the inventor of the CMUcam, who now runs his own lab) with the idea, and it turns out he had just been discussing a CMUcam redesign that morning with a colleague. And so began the CMUcam5…
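To make that two-stage split a little more concrete, here is a minimal, hypothetical Python sketch of the idea: a low-level pass matches each pixel against a few stored “model colors” (the kind of per-frame work the front-end core would do), and a higher-level pass groups the matching pixels into objects with a simple connected-components search (the kind of work the second core would do). The function names, the RGB tolerance test, and the flood-fill grouping are all illustrative assumptions, not Pixy’s actual algorithm or firmware.

```python
# Hypothetical sketch: match pixels against a few "model colors", then group
# matching pixels into objects with a 4-connected flood fill. This only
# illustrates the two-stage pipeline described above.
from collections import deque

def matches_model(pixel, model, tol=30):
    """Low-level stage: does this RGB pixel fall within `tol` of a model color?"""
    return all(abs(p - m) <= tol for p, m in zip(pixel, model))

def find_objects(image, models, tol=30):
    """High-level stage: connected components over pixels that match any model.

    `image` is a 2-D list of (r, g, b) tuples. Returns a list of
    (model_index, (min_x, min_y, max_x, max_y)) entries, one per object.
    """
    h, w = len(image), len(image[0])
    # Label each pixel with the first model color it matches, or None.
    labels = [[next((i for i, m in enumerate(models)
                     if matches_model(image[y][x], m, tol)), None)
               for x in range(w)] for y in range(h)]
    seen = [[False] * w for _ in range(h)]
    objects = []
    for y in range(h):
        for x in range(w):
            if labels[y][x] is None or seen[y][x]:
                continue
            # Flood fill to collect this connected blob of same-model pixels.
            model_idx = labels[y][x]
            queue = deque([(y, x)])
            seen[y][x] = True
            min_x = max_x = x
            min_y = max_y = y
            while queue:
                cy, cx = queue.popleft()
                min_x, max_x = min(min_x, cx), max(max_x, cx)
                min_y, max_y = min(min_y, cy), max(max_y, cy)
                for ny, nx in ((cy - 1, cx), (cy + 1, cx),
                               (cy, cx - 1), (cy, cx + 1)):
                    if (0 <= ny < h and 0 <= nx < w and not seen[ny][nx]
                            and labels[ny][nx] == model_idx):
                        seen[ny][nx] = True
                        queue.append((ny, nx))
            objects.append((model_idx, (min_x, min_y, max_x, max_y)))
    return objects

if __name__ == "__main__":
    # Tiny 4x4 test frame: a red blob in the upper-left, background elsewhere.
    red, black = (255, 0, 0), (0, 0, 0)
    frame = [[red, red, black, black],
             [red, red, black, black],
             [black, black, black, black],
             [black, black, black, black]]
    print(find_objects(frame, models=[red]))  # -> [(0, (0, 0, 1, 1))]
```

The split maps naturally onto a dual-core part like the one described above: the per-pixel matching is streaming, per-frame work suited to a front-end core, while the grouping and reporting can run on the second core.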