Monday, August 29, 2016

NFL Reportedly Using Ball Tracking Chip Sensors in 2016 Pre-Season Games

As reported by Engadget: The NFL is using sensors inside footballs during pre-season to track quarterback throwing speeds, running back acceleration, ball position and other stats, according to Recode. The chips are reportedly made by Zebra, a company that already tracks player statistics for the league using shoulder pad-mounted chips. The NFL used the same ball tracking tech at the Pro Bowl last year, but the experiment is a first for pre-season. Officials haven't decided if they'll continue it once the regular season starts.

Zebra teamed up with Wilson to install the RFID-like chips under the football's laces. Sensors located around the stadium can ping the chips and give stats like velocity, acceleration and ball location (within six inches) to Zebra employees within a half second. (The sensors can't track a ball's air pressure to prevent another Deflategate, though.) The NFL is also tracking kicking balls to see if the goalposts should be moved closer together, but that seems to be a different experiment.
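For a sense of how stats like these could fall out of raw position pings, here's a minimal sketch in Python. The ping format (timestamped 3-D positions) is a hypothetical stand-in; Zebra's actual data model and processing pipeline aren't public.

```python
import math

def derive_motion(pings):
    """Estimate speed and acceleration from timestamped position pings.

    `pings` is a list of (t, x, y, z) tuples in seconds and meters,
    a hypothetical stand-in for what a stadium sensor array reports.
    """
    stats = []
    for (t0, *p0), (t1, *p1) in zip(pings, pings[1:]):
        dt = t1 - t0
        stats.append({"t": t1, "speed": math.dist(p0, p1) / dt})
    # Acceleration: change in speed between consecutive estimates.
    for prev, cur in zip(stats, stats[1:]):
        cur["accel"] = (cur["speed"] - prev["speed"]) / (cur["t"] - prev["t"])
    return stats
```

Fed pings arriving every half second, as the article describes, this would produce speed and acceleration estimates on the same cadence.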

Zebra's shoulder pad trackers, now used by all 32 teams, collect data that can be used to evaluate personnel, scout players and improve safety. It could also provide interesting data to broadcasters, though there's no indication the league has allowed that yet. Last year, the NFL released the shoulder pad data at the end of the season, but in 2016, it will reportedly give it to teams just hours after games end. If Zebra's ball tracking tech is adopted the same way, the devices should soon arrive in regular season games, giving teams (and hopefully fans) more stats to geek out on.

SpaceX's Rival ULA Ramps-Up Plans With 'Space Trucks'

As reported by Christian Science Monitor: Ever since NASA began contracting with commercial companies to get cargo to the International Space Station (ISS), SpaceX has been the face of commercial space shipping, despite several prominent competitors. Now, the company may have a serious rival.
Tory Bruno, chief executive officer of the United Launch Alliance (ULA), a partnership between aerospace engineering companies Lockheed Martin and Boeing, discussed the company’s plans to create a cargo carrier he nicknamed the “space truck” with the news outlet Quartz this week.
Does this signal a shift towards greater specialization in the space engineering market?
SpaceX is not the only company sending shipments of food, experiments, and other supplies to the international space station, but it is the best known – and offers a significantly lower cost. The company’s successful quest to create a reusable rocket has prompted rivals to aggressively seek ways to lower their costs.
The United Launch Alliance is one of those rivals. The company’s launch contract with NASA is due to expire in 2019, meaning that ULA needs to innovate to remain relevant.
Mr. Bruno’s vision for ULA’s future is expansive, and includes plans for space infrastructure that can support lunar colonization by 2020. To that end, the company’s current project aims to make it cheaper and easier to get into space.
Like SpaceX, ULA hopes to achieve its aims by developing a reusable rocket. According to Bruno, unlike rival SpaceX, ULA’s rocket will have a reusable second stage (the portion of the rocket that finishes the journey) as well as a reusable first stage (the stage that propels the rocket from Earth into orbit).
And unlike SpaceX, which has developed the technology to bring reusable rockets back to Earth, ULA plans to leave the reusable second stages in space.
“We realized that you don’t have to bring it back in order for it to be reusable,” Bruno told Quartz. “That’s the big paradigm change in the way that you look at the problem – if you have an upper stage that stays on orbit and is reusable.”
ULA’s second stage design looks like a fuel tank, and can be refueled and reloaded while still in orbit, where it would wait for cargo loads sent up from Earth. Due to the relief rockets (ULA’s “space trucks”) waiting in orbit, a cargo load could be incredibly heavy and still be able to make it to its final destination, whether that be a lunar colony or the ISS.
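A rough way to see why an orbiting, refuelable stage changes the economics is the Tsiolkovsky rocket equation, which ties the velocity change a stage can deliver to its fuel fraction:

$$\Delta v = v_e \ln\frac{m_0}{m_f}$$

where $v_e$ is the engine's exhaust velocity, $m_0$ the fueled mass, and $m_f$ the dry mass. A stage launched from Earth burns most of that budget just reaching orbit; a stage that waits in orbit and is refueled there starts each cargo run with a fresh $m_0/m_f$ ratio, which is what makes very heavy payloads practical.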
Once you have these second stage fuel capsules in space, Bruno says, “It starts becoming practical to construct large-scale infrastructure and support economic activities in space, a transportation system between here and the moon, practical microgravity manufacturing, commercial habitats, prospecting in the asteroids.”
As of last year, ULA also had plans to develop a reusable first stage Vulcan rocket that could be recovered in mid-air. As with its most recently announced second-stage plan, the company placed cost-effectiveness at a premium, with senior staff engineer Mohamed Ragab telling SpaceNews: 
“If you work the math, you see that you’re carrying a lot of fuel to be able to bring the booster back and it takes much longer to realize any savings in terms of the number of missions that you have to fly – and they need to be all successful.”
While there are several other companies with commercial space ambitions and Silicon Valley interest, they are lesser known entities such as Moon Express and Planetary Resources.
Much of today’s space innovation is occurring at the commercial level, making it ever more valuable for companies such as ULA and SpaceX to specialize. 
On Thursday, ULA announced that it has been chosen by NASA to launch the next Mars rover exploration project in 2020. Even the ISS could soon be commercially run.
"Ultimately, our desire is to hand the space station over to either a commercial entity or some other commercial capability,” said NASA’s deputy associate administrator for exploration systems development Bill Hill last Sunday, “so that research can continue in low-earth orbit, we figure that will be in the mid-20s."

Tesla Autopilot Crash Exposes Industry Divide

As reported by IEEE Spectrum: The first death of a driver in a Tesla Model S with its Autopilot system engaged has exposed a fault line running through the self-driving car industry. In one camp, Tesla and many other carmakers believe the best route to a truly driverless car is a step-by-step approach where the vehicle gradually extends control over more functions and in more settings. Tesla’s limited Autopilot system is currently in what it calls “a public beta phase,” with new features arriving in over-the-air software updates.

Google and most self-driving car startups take an opposite view, aiming to deliver vehicles that are fully autonomous from the start, requiring passengers to do little more than tap in their destinations and relax.
The U.S. National Highway Traffic Safety Administration (NHTSA) classifies automation systems from Level 1, sporting basic lane-keeping or anti-lock brakes, through to Level 4, where humans need never touch the wheel (if there is one).
A Level 2 system like Tesla’s Autopilot can take over in certain circumstances, such as highways, but requires human oversight to cope with situations that the car cannot handle—such as detecting pedestrians, cyclists, or, tragically, a white tractor-trailer crossing its path in bright sunlight.
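In code form, the taxonomy described here might be summarized like so (a paraphrase of the 2016-era NHTSA scale; the Level 3 wording is an interpolation, since the article skips it):

```python
# The 2016-era NHTSA automation levels, paraphrased from the text above.
NHTSA_LEVELS = {
    1: "Driver assistance: basic lane-keeping or anti-lock brakes",
    2: "Partial automation (e.g. Tesla Autopilot): drives itself in "
       "limited settings such as highways, under human oversight",
    3: "Conditional automation: the car handles most situations, with "
       "the human as fallback",  # interpolated; not described in the text
    4: "Full automation: humans need never touch the wheel, "
       "if there is one",
}
```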
Proponents of Level 4 technologies say that such an incremental approach to automation can, counter-intuitively, be more difficult than leap-frogging straight to a driverless vehicle. “From a software perspective, Level 2 technology may be simpler to develop than Level 4 technology,” says Karl Iagnemma, CEO of autonomous vehicle startup nuTonomy. “But when you include the driver, understanding, modeling and predicting behavior of that entire system is in fact pretty hard.”
Anthony Levandowski, who built Google’s first self-driving car and now runs autonomous trucking startup Otto, goes even further. “I would expect that there would be plenty of crashes in a system that requires you to pay attention while you’re driving,” he says. “It’s a very advanced cruise control system. And if people use it, some will abuse it.”
Even if drivers are following Tesla’s rules—keeping their hands on the wheel and trying to pay attention to the road—many studies have shown that human motorists with little to do are easily distracted.
At this point, of course, Tesla is extremely unlikely to remotely deactivate the Autopilot system until it reaches Level 4. So what are its options? Experts think that in one respect, at least, Tesla is on the right track. “Putting the self-driving hardware on all your vehicles then activating it with a software update later seems like a great idea,” says Levandowski. “It’s shocking that nobody else did that.”
“There are very few examples of software of this scale and complexity that are shipped in perfect form and require no updating,” agrees Iagnemma. “Developers will certainly need to push out updates, for either improved performance or increased safety or both.”
One sensor noticeably absent from Tesla’s Model S and X is lidar—the laser ranging system favored by the majority of autonomous car makers. It can build up a 360-degree image of a vehicle’s surroundings in the blink of an eye. “The introduction of an additional sensor would help improve system performance and robustness,” says Iagnemma. “What Tesla was thinking, I believe, is that maybe a lidar sensor wasn’t necessary because you have the human operator in the loop, acting as a fail-safe input.”
Mark Halverson is CEO of transportation automation company Precision Autonomy and a member of the IEEE Global Initiative for Ethical Considerations in the Design of Autonomous Systems. He thinks that roads with a mix of connected human drivers and self-driving cars would benefit from a cloud-based traffic management system like the one NASA is developing for drones.
“In this accident, the truck driver and the Tesla driver both knew where they were going,” he says. “They had likely plugged their destinations into GPS systems. If they had been able to share that, it would not have been that difficult to calculate that they would have been at the same position at the same time.”
Halverson thinks that a crowdsourced system would also avoid the complexities and bureaucratic wrangling that have dogged the nearly decade-long effort to roll out vehicle-to-vehicle (V2V) technologies. “A crowdsourcing model, similar to the Waze app, could be very attractive because you can start to introduce information from other sensors about road conditions, pot holes, and the weather,” he says.
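A toy version of the calculation Halverson describes (checking whether two shared itineraries put vehicles in roughly the same place at roughly the same time) might look like this. The waypoint format and thresholds are illustrative assumptions:

```python
from math import dist

def predicted_conflicts(route_a, route_b, radius_m=5.0, window_s=2.0):
    """Flag waypoints where two planned routes coincide in space and time.

    Each route is a list of (t_seconds, x_m, y_m) waypoints, a toy
    stand-in for the GPS itineraries two vehicles could share.
    """
    conflicts = []
    for ta, xa, ya in route_a:
        for tb, xb, yb in route_b:
            if abs(ta - tb) <= window_s and dist((xa, ya), (xb, yb)) <= radius_m:
                conflicts.append((min(ta, tb), (xa, ya)))
    return conflicts
```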
Toyota, another car company that favors rolling out safety technologies before they reach Level 4, has been struggling with the same issues as Tesla. Last year, the world’s largest carmaker announced the formation of a US $1-billion AI research effort, the Toyota Research Institute, to develop new technologies around the theme of transportation. The vision of its CEO, Gill Pratt, is of “guardian angel” systems that allow humans to drive but leap in at the last second if an accident seems likely. His aspiration is for vehicles to cause fatal accidents at most once every trillion miles.
Such technologies might not activate in the entire lifetime of a typical driver, requiring all the expensive hardware and software of a driverless Level 4 vehicle but offering none of its conveniences. “It’s important to articulate these challenges, even if they’re really hard,” says John Leonard, the MIT engineering professor in charge of automated driving at TRI. “A trillion miles is a lot of miles. If I thought it would be easy, I wouldn’t be doing it.”
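To put the trillion-mile target in perspective, assume a typical driver covers roughly 13,500 miles a year (about the U.S. average) over 60 years behind the wheel:

$$\frac{10^{12}\ \text{miles}}{13{,}500\ \text{miles/yr}\times 60\ \text{yr}}\approx 1.2\times 10^{6}\ \text{driver-lifetimes}$$

One fatal accident per trillion miles would therefore span more than a million driver-lifetimes, which is why a guardian angel built to that standard might never fire for any individual driver.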
Elon Musk will surely be hoping that the next Autopilot accident, when it inevitably comes, will be nearly as many miles off.

Wednesday, August 24, 2016

NVIDIA's Made-For-Autonomous-Cars CPU is Freaking Powerful

As reported by Engadget: NVIDIA debuted its Drive PX2 in-car supercomputer at CES in January, and now the company is showing off the Parker system on a chip powering it. The 256-core processor boasts up to 1.5 teraflops of juice for "deep learning-based self-driving AI cockpit systems," according to a post on NVIDIA's blog. That's on top of the 24 trillion deep learning operations per second it can churn out. For a perhaps more familiar touchpoint, NVIDIA says that Parker can also decode and encode 4K video streams running at 60FPS -- no easy feat on its own.

However, Parker is significantly less beefy than NVIDIA's other deep learning initiative, the DGX-1 for Elon Musk's OpenAI, which can hit 170 teraflops of performance. Regardless, this platform still sounds more than capable of running high-end digital dashboards and keeping your future autonomous car shiny side up.
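Some quick arithmetic on the figures quoted above (a sketch for scale, not NVIDIA's own breakdown):

```python
# Ratio between the two quoted peak-throughput figures.
parker_tflops, dgx1_tflops = 1.5, 170.0
print(f"DGX-1 vs. Parker: ~{dgx1_tflops / parker_tflops:.0f}x")  # ~113x

# Pixel throughput implied by a single 4K stream at 60 FPS.
pixels_per_second = 3840 * 2160 * 60
print(f"4K60: ~{pixels_per_second / 1e6:.0f} M pixels/s")  # ~498 M
```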

On that front, NVIDIA says that in addition to the previously announced partnership with Volvo (which puts Drive PX2 into the XC90), there are currently "80 carmakers, tier 1 suppliers and university research centers" using Drive PX2.

Thursday, August 18, 2016

Uber’s First Self-Driving Fleet Arrives in Pittsburgh This Month

As reported by Bloomberg: Near the end of 2014, Uber co-founder and Chief Executive Officer Travis Kalanick flew to Pittsburgh on a mission: to hire dozens of the world’s experts in autonomous vehicles. The city is home to Carnegie Mellon University’s robotics department, which has produced many of the biggest names in the newly hot field. Sebastian Thrun, the creator of Google’s self-driving car project, spent seven years researching autonomous robots at CMU, and the project’s former director, Chris Urmson, was a CMU grad student.

“Travis had an idea that he wanted to do self-driving,” says John Bares, who had run CMU’s National Robotics Engineering Center for 13 years before founding Carnegie Robotics, a Pittsburgh-based company that makes components for self-driving industrial robots used in mining, farming, and the military. “I turned him down three times. But the case was pretty compelling.” Bares joined Uber in January 2015 and by early 2016 had recruited hundreds of engineers, robotics experts, and even a few car mechanics to join the venture. The goal: to replace Uber’s more than 1 million human drivers with robot drivers—as quickly as possible.

The plan seemed audacious, even reckless. And according to most analysts, true self-driving cars are years or decades away. Kalanick begs to differ. “We are going commercial,” he says in an interview with Bloomberg Businessweek. “This can’t just be about science.”

Starting later this month, Uber will allow customers in downtown Pittsburgh to summon self-driving cars from their phones, crossing an important milestone that no automotive or technology company has yet achieved. Google, widely regarded as the leader in the field, has been testing its fleet for several years, and Tesla Motors offers Autopilot, essentially a souped-up cruise control that drives the car on the highway. Earlier this week, Ford announced plans for an autonomous ride-sharing service. But none of these companies has yet brought a self-driving car-sharing service to market.

Uber’s Pittsburgh fleet, which will be supervised by humans in the driver’s seat for the time being, consists of specially modified Volvo XC90 sport-utility vehicles outfitted with dozens of sensors that use cameras, lasers, radar, and GPS receivers. Volvo Cars has so far delivered a handful of vehicles out of a total of 100 due by the end of the year. The two companies signed a pact earlier this year to spend $300 million to develop a fully autonomous car that will be ready for the road by 2021.

The Volvo deal isn’t exclusive; Uber plans to partner with other automakers as it races to recruit more engineers. In July the company reached an agreement to buy Otto, a 91-employee driverless truck startup that was founded earlier this year and includes engineers from a number of high-profile tech companies attempting to bring driverless cars to market, including Google, Apple, and Tesla. Uber declined to disclose the terms of the arrangement, but a person familiar with the deal says that if targets are met, it would be worth 1 percent of Uber’s most recent valuation. That would imply a price of about $680 million. Otto’s current employees will also collectively receive 20 percent of any profits Uber earns from building an autonomous trucking business.

Otto has developed a kit that allows big-rig trucks to steer themselves on highways, in theory freeing up the driver to nap in the back of the cabin. The system is being tested on highways around San Francisco. Aspects of the technology will be incorporated into Uber’s robot livery cabs and will be used to start an Uber-like service for long-haul trucking in the U.S., building on the intracity delivery services, like Uber Eats, that the company already offers.

The Otto deal is a coup for Uber in its simmering battle with Google, which has been plotting its own ride-sharing service using self-driving cars. Otto’s founders were key members of Google’s operation who decamped in January because, according to Otto co-founder Anthony Levandowski, “We were really excited about building something that could be launched early.” Levandowski, one of the original engineers on the self-driving team at Google, started Otto with Lior Ron, who served as the head of product for Google Maps for five years; Claire Delaunay, a Google robotics lead; and Don Burnette, another veteran Google engineer. Google suffered another departure earlier this month when Urmson announced that he, too, was leaving.



“The minute it was clear to us that our friends in Mountain View were going to be getting in the ride-sharing space, we needed to make sure there is an alternative [self-driving car],” says Kalanick. “Because if there is not, we’re not going to have any business.” Developing an autonomous vehicle, he adds, “is basically existential for us.” (Google also invests in Uber through Alphabet’s venture capital division, GV.)

Unlike Google and Tesla, Uber has no intention of manufacturing its own cars, Kalanick says. Instead, the company will strike deals with auto manufacturers, starting with Volvo Cars, and will develop kits for other models. The Otto deal will help; the company makes its own laser detection, or lidar, system, used in many self-driving cars. Kalanick believes that Uber can use the data collected from its app, where human drivers and riders are logging roughly 100 million miles per day, to quickly improve its self-driving mapping and navigation systems. “Nobody has set up software that can reliably drive a car safely without a human,” Kalanick says. “We are focusing on that.”

In Pittsburgh, customers will request cars the normal way, via Uber’s app, and will be paired with a driverless car at random. Trips will be free for the time being, rather than the standard local rate of $1.05 per mile. In the long run, Kalanick says, prices will fall so low that the per-mile cost of travel, even for long trips in rural areas, will be cheaper in a driverless Uber than in a private car. “That could be seen as a threat,” says Volvo Cars CEO Hakan Samuelsson. “We see it as an opportunity.”

Although Kalanick and other self-driving car advocates say the vehicles will ultimately save lives, they face harsh scrutiny for now. In July a driver using Tesla’s Autopilot service died after colliding with a tractor-trailer, apparently because neither the driver nor the car’s computers saw it. (The crash is currently being investigated by the National Highway Traffic Safety Administration.) Google has seen a handful of accidents, but they’ve been less severe, in part because it limits its prototype cars to 25 miles per hour. Uber’s cars haven’t had any fender benders since they began road-testing in Pittsburgh in May, but at some point something will go wrong, according to Raffi Krikorian, the company’s engineering director. “We’re interacting with reality every day,” he says. “It’s coming.”

For now, Uber’s test cars travel with safety drivers, as common sense and the law dictate. These professionally trained engineers sit with their fingertips on the wheel, ready to take control if the car encounters an unexpected obstacle. A co-pilot, in the front passenger seat, takes notes on a laptop, and everything that happens is recorded by cameras inside and outside the car so that any glitches can be ironed out. Each car is also equipped with a tablet computer in the back seat, designed to tell riders that they’re in an autonomous car and to explain what’s happening. “The goal is to wean us off of having drivers in the car, so we don’t want the public talking to our safety drivers,” Krikorian says.

On a recent weekday test drive, the safety drivers were still an essential part of the experience, as Uber’s autonomous car briefly turned un-autonomous while crossing the Allegheny River. A chime sounded, a signal to the driver to take the wheel. A second ding a few seconds later indicated that the car was back under computer control. “Bridges are really hard,” Krikorian says. “And there are like 500 bridges in Pittsburgh.”


Bridges are hard in part because of the way that Uber’s system works. Over the past year and a half, the company has been creating extremely detailed maps that include not just roads and lane markings, but also buildings, potholes, parked cars, fire hydrants, traffic lights, trees, and anything else on Pittsburgh's streets. As the car moves, it collects data, and then using a large, liquid-cooled computer in the trunk, it compares what it sees with the preexisting maps to identify (and avoid) pedestrians, cyclists, stray dogs, and anything else. Bridges, unlike normal streets, offer few environmental cues—there are no buildings, for instance—making it hard for the car to figure out exactly where it is. Uber cars have Global Positioning System sensors, but those are only accurate within about 10 feet; Uber’s systems strive for accuracy down to the inch.
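A heavily simplified sketch of that compare-against-the-map step: score candidate positions by how well the sensed points line up with the prior map, and keep the best one. It also shows why a featureless bridge is hard; with few landmarks, every candidate position scores about the same. (Uber's actual pipeline is, of course, far more sophisticated.)

```python
from math import dist

def alignment_error(sensed_points, map_landmarks):
    """Average distance from each sensed point to its nearest map landmark.
    Lower means the position hypothesis lines up better with the prior map."""
    return sum(min(dist(p, lm) for lm in map_landmarks)
               for p in sensed_points) / len(sensed_points)

def best_pose(candidate_offsets, sensed_points, map_landmarks):
    """Pick the (dx, dy) offset whose shifted sensor points best fit the map."""
    def shift(offset, pts):
        dx, dy = offset
        return [(x + dx, y + dy) for x, y in pts]
    return min(candidate_offsets,
               key=lambda o: alignment_error(shift(o, sensed_points), map_landmarks))
```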

When the Otto acquisition closes, likely this month, Otto co-founder Levandowski will assume leadership of Uber’s driverless car operation while continuing to oversee his company's robotic trucking business. The plan is to open two additional Uber R&D centers: one in the Otto office, a cavernous garage in San Francisco’s Soma neighborhood, and a second in Palo Alto. “I feel like we’re brothers from another mother,” Kalanick says of Levandowski.

The two men first met at the TED conference in 2012, when Levandowski was showing off an early version of Google’s self-driving car. Kalanick offered to buy 20 of the prototypes on the spot—“It seemed like the obvious next step,” he says with a laugh—before Levandowski broke the bad news to him. The cars were running on a loop in a closed course with no pedestrians; they wouldn't be safe outside the TED parking lot. “It was like a roller coaster with no track,” Levandowski explains. “If you were to step in front of the vehicle, it would have just run you over.”

Kalanick began courting Levandowski this spring, broaching the possibility of an acquisition during a series of 10-mile night walks from the Soma neighborhood where Uber is also headquartered to the Golden Gate Bridge. The two men would leave their offices separately—to avoid being seen by employees, the press, or competitors. They’d grab takeout food, then rendezvous near the city’s Ferry Building. Levandowski says he saw a union as a way to bring the company’s trucks to market faster. 

For his part, Kalanick sees it as a way to further corner the market for autonomous driving engineers. “If Uber wants to catch up to Google and be the leader in autonomy, we have to have the best minds,” he says, and then clarifies: “We have to have all the great minds.”

Astronauts Are About to Install a Parking Space for SpaceX and Boeing

As reported by Popular Mechanics: Starting in 2017, Boeing and SpaceX will become the first private companies to send NASA astronauts into orbit. When that happens, the International Space Station is going to need a new parking space. So, on Friday, astronauts Jeff Williams and Kate Rubins will venture outside the ISS to finish installing a new docking adapter.

Installing these adapters is a necessary step in NASA's Commercial Crew Program, which seeks to spur development of commercial crew spacecraft. Since the space shuttle was discontinued in 2011, NASA has had to pay Russia millions to take its astronauts to the ISS. The Commercial Crew Program will give NASA cheaper options in crewed spaceflight.

The spacewalk is scheduled to begin at 8:05 a.m. on Friday, and live coverage will start at 6:30 a.m. This will be Williams' fourth spacewalk and Rubins' first. Here is a video describing exactly what the spacewalk will entail:



Wednesday, August 17, 2016

Beartooth Turns Your Smartphone Into An Industrial Radio

As reported by Gadget Review: All a smartphone is, is a radio with a computer bolted to it. Seriously. That’s it; it’s just radio waves, the same radio waves pumping Today’s Classic Hits through the atmosphere. Of course, smartphones aren’t on the same channel as professional, industrial radios… but Beartooth makes it possible for them to get on those frequencies.

A Radio On Your Radio

Essentially, Beartooth is a radio you attach to your smartphone, and which interacts with your smartphone in various ways using publicly available radio bandwidth. It’s not dissimilar to the GoTenna in that respect, but Beartooth adds a few elements, especially in the physical realm, which make it much more interesting to the outdoorsy and those who just like fiddling with microwave radiation.

The Citizen’s Band, Improved

Essentially, it’s a CB radio with all the features you want from a smartphone. For example, you can have one-to-one calls or group in more people, although they will need a Beartooth to participate. If you’ve got a friend you want to send a text message to, that’s as easy as using your phone. There’s even confirmation of message receipt built in, and you can also send out an SOS signal if you need it, making this crucial for campers and outdoorsy types. Also useful is the fact that another battery is built into the case, doubling the life of your phone.
The Great Outdoors
The simple fact of the matter is that if you’re going outdoors and getting out of contact range, you should probably have a radio, signal beacon, or other tool available to reach help if you get stuck. So, just by that standard, the Beartooth is a useful tool to have in your camping gear. And, also, it means you can send text messages to your friends when you’re waiting for the food to cook, so that’s useful time spent right there.

With Walabot, Your Phone Can "See" Through Walls

As reported by FastCompany: In the first demo, Raviv Melamed, CEO and cofounder of Vayyar and Walabot, uses the camera on his phone to see through our conference room table and detect the number of fingers he's holding up beneath the surface. Next, there's a video of a person walking down a hall and moving behind a barrier; the technology senses the human form even though it's no longer visible to our eyes. Then comes a clip of vodka being poured past a couple of sensors to determine the purity of the alcohol on a microscopic level.

This superhero-like X-ray vision comes courtesy of a new microchip-based, radio frequency sensor technology. It can be used to analyze and create 3-D images of pretty much anything behind or inside objects (the only thing it can't "see" through is metal). Radio frequency tech has been around for decades; what makes this chip innovative is that it instantly transforms RF waves into digital output. Radio waves emitted from the chip sense how much of the signal is absorbed by an object in its path. Algorithms can then be applied to the digital data translated from the RF signals to determine all kinds of information about those objects: their density, their dimensions, and, using software, what those objects actually are.
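The core measurement is simple to state: compare transmitted and received power, and the loss in decibels hints at what the signal passed through. A minimal sketch with illustrative thresholds (Vayyar's actual algorithms are proprietary):

```python
import math

def path_loss_db(p_transmitted_mw, p_received_mw):
    """Signal attenuation between emitter and receiver, in decibels."""
    return 10 * math.log10(p_transmitted_mw / p_received_mw)

loss = path_loss_db(100.0, 0.5)  # ~23 dB of absorption along the path
guess = "dense object in path" if loss > 20 else "air or light material"
```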
Though it's not the first technology capable of turning RF into digital, it's certainly among the smallest and least expensive—the sensor could fit inside a mobile phone. Vayyar, which created the chip technology and is selling Walabot, is based in Tel Aviv with 36 employees, and has raised $34 million in funding. It's part of the rapidly evolving imaging technology sector that includes the handheld ultrasound device from Butterfly Network Inc., the space-mapping camera from Matterport, and the security scanner that can detect weapons underneath clothing from the U.K.-based company Radio Physics.
The first application of Vayyar's chip is medical: It's being developed to detect tumors in breast tissue. Since it can be produced at a fraction of the cost, and physical size, of today's solutions, it potentially makes breast cancer screening accessible and affordable to people around the globe.

What else could it be used for? That's where you come in. Walabot is being released publicly in April so that robot makers and hardware tinkerers can build their own apps for Android, Raspberry Pi, or most any other computer with a USB connection. "Why limit the technology for one startup when you can actually go and allow other people to innovate?" says Melamed, a former Intel executive and Israeli Defense Forces engineer.

Walabot has seemingly endless potential applications. It could be used to analyze your breathing while you sleep, or examine root structures in your garden, or track the speed of cars racing past your house. And when it comes to video gaming, Melamed says this technology is far more accurate than any other single motion sensor currently on the market. It could help untether VR headsets by pairing with sensors placed on the body—perhaps simple bands around players' arms and legs.

Melamed uses the example of a simple virtual ping-pong match. Right now, the only moving body parts would be the head and a hand, since that's all that can be tracked. "I want to see your body, I want to see your movement, right?" says Melamed. "You have those other technologies, like accelerometers—the problem with accelerometers is they drift. What we can do with this technology is actually put several sensors on your body and track your body in a room like 5 meters by 5 meters, to the level of a centimeter, and now this is a totally different kind of feeling, I can actually see your limbs and we don't drift."
Of course, accelerometer-based technologies like the Gear VR, Oculus Rift, and Google Cardboard have all addressed and continue to minimize the drift issue by applying other sensor-based technologies to their processes—and there are other companies on the market that are attempting to bring the full-body experience to VR via sensors placed on the body. The difference is that RF technology can be deployed for virtual reality pretty successfully without the aid of other devices like accelerometers or magnetometers.
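The drift argument comes down to how each position fix is computed. Integrating accelerometer readings accumulates error over time, while ranging against fixed sensors produces a fresh absolute fix every frame. A two-dimensional trilateration sketch:

```python
def trilaterate(anchors, ranges):
    """Locate a point from its distances to three fixed anchors (2-D).

    Unlike integrating accelerometer data, each fix here is computed
    fresh from absolute range measurements, so error cannot accumulate.
    """
    (x1, y1), (x2, y2), (x3, y3) = anchors
    r1, r2, r3 = ranges
    # Subtracting pairs of circle equations yields a linear 2x2 system.
    a, b = 2 * (x2 - x1), 2 * (y2 - y1)
    c = r1**2 - r2**2 + x2**2 - x1**2 + y2**2 - y1**2
    d, e = 2 * (x3 - x1), 2 * (y3 - y1)
    f = r1**2 - r3**2 + x3**2 - x1**2 + y3**2 - y1**2
    det = a * e - b * d
    return ((c * e - b * f) / det, (a * f - c * d) / det)

# Example: anchors at three room corners, target at (2, 1).
# trilaterate([(0, 0), (5, 0), (0, 5)], [5**0.5, 10**0.5, 20**0.5]) -> (2.0, 1.0)
```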
How does Vayyar's technology differ from something like the Kinect? Well, for one thing, the Kinect is primarily a camera-based optics system, while Vayyar's system is radio frequency based. Kinect works pretty well in the dark, but radio frequency works without any light. For another, the Kinect offers 30 frames per second, while Melamed claims that Vayyar's technology can process 100 frames per second.

This breakthrough in imaging tech is what can happen when a successful executive moves on to the next chapter. Melamed was the vice president of Architecture and General Manager of Mobile Wireless at Intel back in 2010, when he asked himself: What's next? He began looking at medical imaging, which still relies on technology that was developed before mobile computing made chips and sensors low cost and lightweight.
"I started to ask the questions, 'How come there is no simple, easy-to-use modality that you can bring to people instead of these big machines that cost so much money?" he recalls. "Breast cancer was a huge problem to solve and a big market, so a good combination," he adds (it's also an illness that touched his life personally when his mother developed the disease). Melamed decided to apply expertise from both his work inside the Israeli Defense Force 20 years ago and his time at Intel working with high-end communication chips to build a better detection system for breast cancer—and soon realized that the radio transmission-based technology he was developing could be applied to a range of industries and problems. Vayyar was born.



In the lead-up to its April launch, the Walabot team is giving the technology to a handful of leading makers, holding internal hackathons, and studying how people play with the Vayyar chip. "The whole point is to start a community around it and have people kind of play around with it and develop around it," Melamed says, adding that he hopes developers will share their application code with each other.
Melamed says he believes great innovations occur when people find ways to apply technological breakthroughs in one industry to another—and thinks his work on Vayyar, which combines his experiences in the digital communications world with radio frequency technology, is a prime example of that. "I think a lot of the breakthroughs came when people took one technology from one industry and implemented it in a totally different industry, and that is basically what we are trying to do here," he says. "I went and looked at what people did in the past, and for the last 30 years people are trying to do breast cancer imaging or any other imaging with radars with radio frequencies—but they kept bouncing into the same problems," Melamed says. "With the architecture we are using in communication and the things that we are able to do over there, bringing technology from that world into this world, that's kind of created that breakthrough, the ability to put so many transceivers and such a high-end kind of technology into a small silicon."



Walabot will be sold in three models—each a bit more powerful than the last—and will cost between $149 and $599. It started shipping in April 2016, when the public developer API also became available.

Monday, August 15, 2016

Audi Cars Will Start Talking to City Traffic Systems This Fall

As reported by Engadget: If you've ever been stuck at a red light that seems to last an eternity, you'll be happy to know that Audi has announced it will start working with municipalities to tell its cars when a light is about to turn green. The automaker says this is the first step in a Vehicle to Infrastructure (V2I) partnership with cities that will be launching this fall.

The Audis won't be talking to the traffic lights directly; instead, the vehicles will use their built-in LTE connection to get information from a participating city's central traffic control system. Using that data and GPS, the cars will be able to show on the dashboard when an upcoming signal will turn green.



The system does not use the upcoming DSRC V2V (vehicle-to-vehicle)/V2I (vehicle-to-infrastructure) standard. Instead, it uses partner Traffic Technology Services to establish a data relationship with the municipalities. As a vehicle enters a "zone," it requests a one-time unique token to establish communication with the infrastructure and request the stop light phase.
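In outline, that flow might look like the sketch below. Every endpoint name and field here is an assumption for illustration; the actual Traffic Technology Services interface isn't public.

```python
import time
import requests  # hypothetical REST transport

TTS_BASE = "https://tts.example.com/api"  # placeholder endpoint

def seconds_to_green(vehicle_id, zone_id, signal_id):
    """Zone-token flow: request a one-time token on zone entry, then use
    it to query the upcoming phase of a specific traffic signal."""
    token = requests.post(f"{TTS_BASE}/zones/{zone_id}/token",
                          json={"vehicle": vehicle_id}).json()["token"]
    phase = requests.get(f"{TTS_BASE}/signals/{signal_id}/phase",
                         headers={"Authorization": f"Bearer {token}"}).json()
    return max(0.0, phase["next_green_epoch"] - time.time())
```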

As for DSRC, Audi product and technology communications senior specialist Justin Goduto said it's not quite ready for widespread deployment, but Audi wants to move forward now. "For the time being using this methodology gives us true integration to the infrastructure," Goduto said.

The technology needed to get all that green light information is available in 2017 Audi Q7, A4 and A4 Allroad vehicles built after June 1, 2016. Drivers will also need to subscribe to Audi Connect Prime. As for the cities, the automaker isn't ready to announce where the V2I infrastructure will roll out first. But it hopes the system will be working in five to seven metropolitan areas by the end of the year.

SpaceX Nails a Tricky Fourth Rocket Landing at Sea

As reported by Engadget: SpaceX is good enough at sea-based rocket landings that they've nearly become commonplace. The private spaceflight outfit has successfully landed a Falcon 9 rocket aboard a drone ship for the fourth time, or its sixth landing overall. And this wasn't a particularly easy trip, either. On top of the inherent challenges of a sea landing, the destination for the rocket's payload (the JCSAT-16 communications satellite) meant that the vehicle had to contend with both "extreme velocities" and high re-entry heat.

No, SpaceX still hasn't reused a rocket yet -- that's happening in the fall. However, the touchdown suggests that the company might just meet its objective of launching a rocket every two weeks by the end of 2016. There are no guarantees that it'll land every time (just ask SpaceX what happened in June), but the success rate is now consistent enough that Elon Musk and crew can expect that rockets will return intact.



First stage landing confirmed on the droneship. Second stage & JCSAT-16 continuing to orbit http://www.spacex.com/webcast