TrackingPoint takes (automated) aim at the future of shooting
TrackingPoint’s iPhone-connected, guided shooting system has not only made a big splash in the gun press, but the company’s debut at CES also got it plenty of coverage in the mainstream tech press. If you’re not familiar with TrackingPoint, my former colleague at Ars Technica, Lee Hutchinson, has written a great intro to the product. In brief, TrackingPoint makes a line of three bolt-action hunting rifles, each of which comes with an integrated scope. What’s special about TrackingPoint’s scope-and-gun combo is that the scope allows you to “tag” a target, and the gun will then fire when you put the crosshairs over the tag.
I got a chance to shoot a TrackingPoint gun at SHOT Show’s Media Day, and I can testify that it definitely works. I had never before attempted a 1,000-yard shot, but with the TP rifle I hit the target both times I fired at it. I didn’t quite score a bullseye, due to the effect of wind on my shot (more on this below), but I would definitely have taken out any medium-to-large game animal at that distance.
The next day at SHOT, I got a chance to sit down with TrackingPoint’s John Lupher, one of the main engineers behind the scope/gun combo, to talk about how it all works and what the company’s plans are for future products.
The nuts and bolts of a perfect shot
The TrackingPoint scope is essentially a digital image processing system, the core of which consists of four primary chips.
The first of these is the 14-megapixel CMOS image sensor that captures the real-time imagery for the scope. The sensor captures video as 3600×1600-pixel frames at a rate of 54 frames per second.
This video feed flows into three parallel paths:
1. The display for the user. This includes both the display that’s visible to the shooter via the scope’s eyepiece, and the video that’s streamed from the scope’s built-in WiFi server to any connected iPhone or iPad.
2. The on-board video recording device, which is a kind of “black box” that records footage of shots.
3. The PIPE, or Predictive Input Processing Engine, which does the actual tagging and tracking. This last system works on black-and-white, luminance-only data instead of the full-color video feed (see the sketch below).
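To make the luminance-only point concrete, here’s a minimal sketch of how a tracker might reduce a full-color frame to a luma plane before doing any matching. This is an illustration of the general technique, not TrackingPoint’s actual code; the OpenCV call and the BT.601 weights are my own assumptions.

```python
import cv2
import numpy as np

def to_luma(frame_bgr: np.ndarray) -> np.ndarray:
    """Reduce a full-color BGR frame to an 8-bit luminance plane.

    Tracking on luma alone cuts the data volume to one third and
    discards chroma, which adds little to feature matching.
    """
    # cv2.cvtColor applies the standard BT.601 luma weights:
    # Y = 0.299 R + 0.587 G + 0.114 B
    return cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)

# Example: a blank 3600x1600 frame, matching the scope's stated
# capture resolution.
frame = np.zeros((1600, 3600, 3), dtype=np.uint8)
luma = to_luma(frame)
assert luma.shape == (1600, 3600)
```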
Upgradable firmware, and a Wal-Mart version
Most of the image processing is done in the scope’s FPGA, a type of chip that can be “rewired” via software. At the heart of the FPGA’s image processing capabilities is Altera’s 32-bit Nios II “soft core,” which the system uses to configure and control the parts of the FPGA that implement functions like automatic gain adjustment, sharpening algorithms, and light-level control. This virtual CPU runs at 200MHz, so it’s not exactly high-performance, but it’s plenty fast enough to do its per-frame work within the allotted 18.5ms time window.
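The numbers here are worth a quick sanity check. At 54 frames per second, each frame gets 1/54 of a second, which is where the 18.5ms figure comes from, and a 200MHz core gets about 3.7 million cycles in that window, fewer than the 5.76 million pixels in a 3600×1600 frame. That’s consistent with the division of labor Lupher describes: the pixel-crunching lives in dedicated FPGA logic, and the soft core steers it. A back-of-the-envelope check in Python, using only the figures quoted above:

```python
# Back-of-the-envelope check on the scope's stated numbers.
FPS = 54                     # frames per second
CLOCK_HZ = 200e6             # Nios II soft-core clock
WIDTH, HEIGHT = 3600, 1600   # stated frame size

frame_period_ms = 1000 / FPS          # ~18.5 ms, matching the article
cycles_per_frame = CLOCK_HZ / FPS     # ~3.7 million cycles
pixels_per_frame = WIDTH * HEIGHT     # 5.76 million pixels
pixel_rate = pixels_per_frame * FPS   # ~311 Mpixels/s into the FPGA

print(f"frame period:     {frame_period_ms:.1f} ms")
print(f"cycles per frame: {cycles_per_frame / 1e6:.1f} M")
print(f"pixels per frame: {pixels_per_frame / 1e6:.2f} M")
print(f"pixel rate:       {pixel_rate / 1e6:.0f} Mpx/s")
```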
Lupher told me that TrackingPoint is constantly refining its image-processing algorithms, and that the system allows customers to upgrade the FPGA in the field: you download the latest firmware to the iPhone and update the scope from there.
“We can do new releases, and allow for apps that have new features, and they’ll get programmed into the FPGA after shipment of the product,” said Lupher.
FPGAs are quite costly, and this part adds significantly to the system’s hefty price tag, but the company uses the technology because of its flexibility and extensibility. However, cheaper iterations of TrackingPoint’s system will ditch the FPGA for a lower-cost ASIC.
“If we go to a lower price-point product, which we have some on our roadmap, and the volumes go up, then we’ll be looking at doing an ASIC, as well,” Lupher told us. “That will depend on the volumes that we see and the feature set that we want to implement. There’s no question in a couple of years that we’ll have an ASIC for a lower price point product — the Wal-Mart version of this. We’re probably 3 years away from that.”
Linux and ARM
The actual tracking algorithms, which render the tag point on each frame of video, run on a TI DSP chip. The DSP searches each frame for the proper location of the tag point, a feat it performs at 54 frames per second.
The scope also includes an ARM processor running Linux that handles housekeeping, networking, the user interface, and other chores that aren’t real-time. Functions that can tolerate some latency run on the ARM core under Linux, while the DSP and FPGA handle everything that must happen in real time, within the 18.5-millisecond-per-frame processing window.
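To give a flavor of what a per-frame tag search looks like, here’s a minimal template-matching tracker in Python with OpenCV. It is emphatically not TrackingPoint’s algorithm, which the article doesn’t detail; it just illustrates the general shape of the job the DSP does 54 times a second: take a luminance frame, find the best match for the tagged patch, and report its new position.

```python
import cv2
import numpy as np

def track_tag(luma_frame: np.ndarray, tag_patch: np.ndarray) -> tuple[int, int]:
    """Find the tagged patch in a new luminance frame.

    Returns the (x, y) of the patch's top-left corner in the frame.
    Normalized cross-correlation is a classic, simple choice; a
    production tracker would be far more robust to lighting changes
    and motion than this is.
    """
    scores = cv2.matchTemplate(luma_frame, tag_patch, cv2.TM_CCOEFF_NORMED)
    _, _, _, best_loc = cv2.minMaxLoc(scores)  # location of the highest score
    return best_loc

# Example with synthetic data: "tag" a patch, then find it again.
frame = np.random.randint(0, 256, (1600, 3600), dtype=np.uint8)
tag = frame[800:832, 1800:1832].copy()   # 32x32 patch at a known spot
print(track_tag(frame, tag))             # -> (1800, 800)
```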
Military possibilities
TrackingPoint is a new company, founded in February of 2011 solely for the purpose of making what it calls “precision guided firearms”. The primary founder is John McHale, a serial entrepreneur now on his fifth startup. McHale and some early employees and board members provided a significant amount of TrackingPoint’s early funding; the rest comes from Austin Ventures in Austin, TX.
I asked Lupher if TrackingPoint was intended for the military market, and he replied that the company’s strategy is to produce a civilian product first and use that to get interest from the military.
“We thought we’d go to the commercial market first and see how that works, in terms of the military coming to us,” Lupher said. “So far it’s worked out quite well, and we’ve got a lot of interest from some components of the military. We’ll build a milspec product at some point. We’ll probably kick off a project to do that in the next six months.”
TrackingPoint’s current product is built to commercial standards, but the company does extensive testing for shock resistance and waterproofing.
I asked Lupher where TrackingPoint is headed next, and he told me that most of the company’s current research is focused on automated wind reading. Right now, the only user input to the scope is a wind setting. My own experience at Media Day suggests that the wind setting is hard to get right, since many of us missed clean bullseyes due to the heavy wind at the range. TrackingPoint’s plan is to fully automate the long-distance shooting solution by taking real-time wind measurements into account.
TrackingPoint is also looking at night vision, and its research team is working on algorithm improvements for low-light and night imaging. Clearly, this work is aimed at an eventual product for the military and/or law enforcement.
Finally, the company is also working to improve the scope’s tracking algorithms, to make the tracking of moving targets more robust. They want the scope to be able to track a target through changes in light level, and to tolerate more gun motion.
Varmint hunters and drop-in kits
I asked Lupher about the possibility of a drop-in kit, which would let users add TrackingPoint’s technology to their existing rifle.
“At some point, we’ll come out with a rifle scope that you can put on your own gun, and that could be within the year,” Lupher told me. “But there’s a few things we have to decide on this roadmap, and how our customers respond to this launch is going to affect that.”
I asked about a drop-in kit for the AR-15 platform, and Lupher said there was some hesitation about committing resources to the platform while current politics have placed a question mark over its future in the civilian market.
Lupher told me that they’re looking at the varmint space as a possibility for their next product, and he mentioned prairie dog hunting in particular.
Whatever TrackingPoint ends up doing with the drop-in kit idea, it won’t be quite as accurate as the full package. In order to control accuracy, they have to control everything about the firing and the ammunition. They have to know the ballistics and tolerances of the rifle, and they also have to know the variability in those parameters between individual units coming off the production line. So to get the ultimate in accuracy, they make the complete package; as they branch out into partial packages, accuracy will necessarily take a hit.
“If we went to a different rifle or a customer-provided rifle, we’re not going to be as accurate,” Lupher said. “At some point, that will be acceptable. But right now we’re trying to set a standard for the company, and to set as high a bar as we can for the brand. We don’t want to be known for almost making the shot — we want to be known for making the shot.”
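A toy calculation shows why ammunition tolerances matter so much at long range. Ignoring drag entirely (a big simplification; real ballistics solvers model it), bullet drop grows with the square of time of flight, so even a small spread in muzzle velocity moves the impact point by inches at 1,000 yards. The velocity figure below is hypothetical, chosen only for illustration:

```python
# Toy sensitivity check: how much does a small muzzle-velocity
# variation move the point of impact at long range?  Drag is ignored,
# so the absolute numbers are unrealistic, but the square-law
# sensitivity to velocity is the point.
G = 32.174            # ft/s^2
RANGE_FT = 3000.0     # 1,000 yards
V_NOMINAL = 2800.0    # ft/s, a hypothetical muzzle velocity

def vacuum_drop(v0: float) -> float:
    """Bullet drop (feet) over RANGE_FT, drag ignored."""
    t = RANGE_FT / v0          # time of flight
    return 0.5 * G * t * t

drop = vacuum_drop(V_NOMINAL)
drop_slow = vacuum_drop(V_NOMINAL * 0.99)   # a round 1% slower than spec
print(f"nominal drop: {drop:.1f} ft")
print(f"1% slow round shifts impact by {12 * (drop_slow - drop):.1f} inches")
```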
Photos © Bryan William Jones.
Comments
kinda cool, and with all the smartphones out there more stuff like this is to come.
the image processing isn't too tough to do - with openCV i have a similar demo working on Android 2.2 - i can select a set of pixels to track, and the camera tracks them. when those pixels fall under my drawn reticle, i light up an LED.
the mechanics are a bit trickier - but a solenoid (right now using a car's electric lock solenoid) works fine if reaction time doesn't need to be too fast.
a $50 smartphone off ebay, a $5 solenoid, and i'm a long-distance shooter!
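For reference, a minimal Python/OpenCV sketch of the loop the commenter describes; the tracker choice and reticle coordinates are assumptions, and the solenoid drive is stubbed out as a print:

```python
import cv2

# Sketch of the commenter's demo: track a user-selected patch and
# fire an output when it falls under a fixed reticle.  CSRT stands in
# for whatever "set of pixels" tracker they used on Android.
cap = cv2.VideoCapture(0)
ok, frame = cap.read()
box = cv2.selectROI("select target", frame)   # user drags a box
cv2.destroyAllWindows()
tracker = cv2.TrackerCSRT_create()
tracker.init(frame, box)

RETICLE = (320, 240)   # assumed center of a 640x480 feed

while True:
    ok, frame = cap.read()
    if not ok:
        break
    found, (x, y, w, h) = tracker.update(frame)
    on_x = abs(x + w / 2 - RETICLE[0]) < 10
    on_y = abs(y + h / 2 - RETICLE[1]) < 10
    if found and on_x and on_y:
        print("on target")   # here: pulse the solenoid via GPIO/serial
```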
Instead of rifles, it will be vastly more useful mounted on RPGs.
Instead of tag and track, use optoelectronic or acoustic recognition. Or thermal.
If they can pinpoint rifle shots now, they can pinpoint rotor or engine sounds.
Imagine: downing helos will be a piece of cake,
and hitting fast-moving cars too. It will be the ultimate triggerbot of the real world.
Like that dude from Inception:
"We mustn't be afraid to dream a little bigger, darling~"