
Tuesday, September 12, 2017

GM Boasts “The World’s First Mass-Producible Driverless Car”

As reported by Fast Company: General Motors has unveiled the third version of its self-driving car–the first such car “that meets the redundancy and safety requirements we believe are necessary to operate without a driver,” Kyle Vogt, CEO of the General Motors-owned self-driving car startup Cruise, wrote in a blog post. “There’s no other car like this in existence.” And, he says, it’s the first car that’s ready to be produced at scale once the software and regulations are in place.
  • Vogt says the company plans to add these vehicles (with humans at the wheel) to the on-demand fleet that caters to Cruise employees in San Francisco “in a few weeks”–though there’s no timeline for mass production at GM’s Lake Orion, MI plant.
  •  The car “has airbags, crumple zones, and comfortable seats. It’s assembled in a high-volume assembly plant capable of producing 100,000’s of vehicles per year, and we’d like to keep that plant busy.”
  • “Unlike the previous generations, which were similar to Chevrolet Bolt EV design, the vehicles we’re unveiling today have almost completely new and fault-tolerant electrical, communication, and actuation systems that are unique to a driverless vehicle.”
Tesla, by contrast, has so aggressively positioned itself as a purveyor of self-driving technology that some customers have ceded too much control to their vehicles. In 2016, a Tesla Model S drove into the side of a tractor trailer that was turning left across its path. The driver had Autopilot engaged and did not have his hands on the wheel. The National Transportation Safety Board has suggested that the Autopilot feature contributed to the crash, according to Bloomberg News.


Such a finding would be another blow to the safety reputation of self-driving technologies, and it could shape the legislation Congress is crafting to spur autonomous vehicle systems. But it also creates an opening for other companies to assert their dominance in the race toward safe self-driving cars. GM’s advantage has always been its infrastructure for designing and building cars. That holds true regardless of whether its new generation of autonomous vehicles is at or above the level of its competitors.

Monday, September 11, 2017

Why Self-Driving Vehicles Need Superhuman Senses

As reported by Wired: More than any other benefit, self-driving vehicles promise to save lives. Cutting out the human error that causes 90 percent of crashes could start to save some of the 35,000 lives lost on American roads every year. Manufacturers are convinced that people will happily use at least partially autonomous cars once they’re proven to be safer than human drivers, but that’s a pretty low bar. The ultimate goal is to eliminate crashes altogether, and to do that, cars will need to perfectly perceive and understand the world around them—they'll need superhuman senses.

Pretty much every AV now in testing uses some combination of cameras, radars, and lidar laser systems. But now, an Israeli startup wants to add a new tool to the mix: heat-detecting infrared cameras that can pick out pedestrians from hundreds of feet away.

A fully driverless car, after all, will need to see the world in a wide variety of lighting and weather conditions. “Existing sensors and cameras available today can’t meet this need on their own,” said AdaSky CEO Avi Katz in a statement. So this morning, his company announced its plan to offer automakers what it calls Viper, a long-distance infrared camera and accompanying computer vision system.

Today's sensors offer a detailed view of the world in 360 degrees, but each has its weak points. Cameras don’t work well at nighttime, or in dazzling sunlight. Lidar has trouble with rain, fog, and dust, because the laser bounces off the particles in the atmosphere. Radar can be confused by small but highly reflective metal objects, like a soda can in the street.

Even systems that combine data from all three sensors can struggle with images of humans on billboards, or on adverts on other vehicles, as recently shown by Cognata, which simulates training environments for driverless car brains. That’s where AdaSky thinks its sensor can pitch in. If a human-shaped object is giving off heat, it’s probably a real person, not a picture.
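The cross-check AdaSky describes boils down to a simple fusion rule: only treat a detection as a live pedestrian when the camera's human-shaped object also radiates body heat in the thermal image. Here is a minimal sketch of that idea, with a made-up temperature threshold and hypothetical inputs (this is not AdaSky's actual pipeline):

```python
# Toy sensor-fusion rule: a camera detection that "looks human" is only
# treated as a live pedestrian if the thermal camera also reads body
# heat at the same spot. Threshold and inputs are illustrative.

BODY_HEAT_C = 30.0  # hypothetical minimum surface temperature for a person

def is_live_pedestrian(camera_sees_human: bool, ir_temp_c: float) -> bool:
    """Require both cues to agree before flagging a real person."""
    return camera_sees_human and ir_temp_c >= BODY_HEAT_C

# A human figure on a billboard sits at ambient temperature:
print(is_live_pedestrian(True, 18.0))   # prints False
# A real pedestrian both looks human and radiates heat:
print(is_live_pedestrian(True, 33.5))   # prints True
```

A real system would of course fuse continuous confidence scores rather than booleans, but the and-gate captures why a warm, human-shaped object is a stronger signal than either cue alone.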

“When we have heat radiating from something, and you figure out that it’s a person or an animal, then that tells you there’s the potential for unpredictable behavior,” says Jeff Miller, who studies autonomous vehicles at USC. Instead of just knowing there’s an object on the right-hand side of the road, a car that perceived it was a deer would proceed more cautiously.
(The system might even help cars figure out those kid-shaped, terribly-dressed bollards that recently showed up in the UK.)
Perception isn't just about sight, or even the outside world. Waymo, née Google’s self-driving project, recently announced it now uses upgraded microphones to listen for police sirens, for example. Cadillac stuck an infrared camera on the steering wheel to monitor the driver’s state of awareness when its cars are in semi-autonomous mode.
This process of finding the right mix of sensors will likely never stop evolving, as new technologies become available and car companies puzzle over factors like cost, availability, and durability. Because until the day that cars are 100 percent safe, passengers will have to be convinced that the vehicle they’re climbing into is at least better at handling any situation than a human driver. And when it comes to making cars superhuman, the better they see, the better they drive.

Tesla Extended the Range of Some Florida Vehicles for Drivers to Escape Hurricane Irma

As Hurricane Irma bears down on Florida, Tesla issued an over-the-air update to drivers in the state that unlocks the full battery capacity of its 60 and 70 kilowatt-hour Model S and X vehicles. The update provides those trying to escape the path of the storm with an additional 30 to 40 miles above the typical range of the vehicle, according to Electrek.

Tesla’s 60 and 60D vehicles offer a range of just above 200 miles on a charge. Faced with an order to leave, one Tesla owner contacted the company, saying that they needed an additional 30 miles of range to get out of the mandatory evacuation zone they were in. In response, the company issued an update to other drivers in the state, providing them with the full 75 kWh capacity of their vehicles through September 16th. One driver posted a screenshot of his app, which showed off the new extended range. A Tesla spokesperson confirmed that the company’s 70 kWh vehicles also received the update.

Tesla introduced its cheaper Model X 60D and Model S 60 / 60D vehicles last year. The vehicles are equipped with a 75 kWh battery, but they are software-locked to use only 80 percent of that available capacity. Drivers could unlock the extra capacity with an update for an additional $3,000. Tesla has since discontinued the Model S 60/60D and Model X 60D vehicles, saying that most owners were simply opting for the higher-range vehicles.
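The figures in this story hang together as simple arithmetic: 80 percent of the physical 75 kWh pack is the 60 kWh the badge advertises, so the over-the-air unlock frees the remaining 15 kWh. A quick sanity check using only the numbers quoted above:

```python
# Sanity-check the software-lock arithmetic quoted in the article.
pack_kwh = 75.0          # physical battery capacity
locked_fraction = 0.80   # usable share before the paid unlock

usable_kwh = pack_kwh * locked_fraction   # the "Model S 60" badge capacity
extra_kwh = pack_kwh - usable_kwh         # capacity freed by the update

print(usable_kwh, extra_kwh)
```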


Thursday, September 7, 2017

Google has Updated its Street View Cameras for the First Time in Eight Years

Better cameras mean better photos and source data for Google's machine learning capabilities.
As reported by The Verge: The days of the large, globe-like cameras on top of Google Street View cars may be slowly disappearing. Google has refreshed the design of the cameras used to capture Street View images, in its first significant upgrade in eight years, reports Wired. The new camera rig will help capture photos that are clearer, higher in resolution, and more vivid in color. Like the old design, the rig will attach to a vehicle’s roof, but the smaller ball on top now features just seven cameras (down from 15) fitted with 20-megapixel sensors. The rig also plays host to two cameras that take still HD photos, and two “cans” on the front and back for laser radar (lidar).

The main ball, with seven 20MP cameras.  There are also two Velodyne Lidar Pucks mounted at an angle.
Google’s machine learning and AI capabilities mean that when photos are captured by a Street View car, algorithms can detect and note relevant street names and numbers, automatically adding them to Google’s database. The software can also identify business names and logos. Google is working to improve the software, taking its capabilities further so that one day it can recognize different types of stores based on what they look like and read smaller signage that shows details like opening hours.

The HD camera that captures building data on the left and right of the car.
Better technology should lead to better source data for Google’s machine learning capabilities, which will allow it to expand Google’s search and Assistant functions. In an example given to Wired, Jen Fitzpatrick, Google’s vice president and head of its mapping arm, says that in the future, Google Maps might be able to answer more detailed questions that reference how the world looks, such as “What’s the name of the pink store next to the church on the corner?” Those questions can only be answered if “we have richer and deeper information,” she says.

Google Street View has been taking photos since 2007, and its fleet of Street View vehicles includes cars, snowmobiles, trikes, and a trolley that captures photos indoors in places like museums. The camera rig has even been outfitted onto a backpack to take images of places that aren’t easily reached by vehicle.

We’ve reached out to Google for additional details on the new rig’s rollout.



SpaceX Just Launched Its “Secret Mission” for the U.S. Air Force

At approximately 10:00 a.m. ET today, a SpaceX Falcon 9 rocket launched from the Kennedy Space Center carrying the X-37B space drone and what might be one of the most mysterious payloads ever. The Falcon 9 took off right on schedule, despite initial concerns over weather conditions.
Several minutes later, the Falcon 9 also successfully landed back on Earth, though this time not on SpaceX’s floating barge but rather on a landing zone prepared at the Cape Canaveral Air Force Station. This is the 10th successful Falcon 9 landing ever, and the reusable rocket’s 41st flight to date.

As for this mysterious payload, it’s a host of firsts for SpaceX. While Elon Musk’s rocket venture has flown missions as part of government contracts before — like its resupply missions for NASA — today’s launch was SpaceX’s first mission for the U.S. Air Force. In June of this year, SpaceX secured a contract from the U.S. Air Force to ferry the X-37B drone into orbit, marking the fifth mission for this unmanned Orbital Test Vehicle (OTV) and the first one accomplished using a private space company’s rocket.

Not much is known about the missions of the X-37B OTV beyond what the U.S. Air Force chooses to reveal. We do know that the space drone has been gradually extending its time in orbit with each of its missions. This particular mission, for instance, hosts the Air Force Research Laboratory’s Advanced Structurally Embedded Thermal Spreader (ASETS-II), an instrument designed for prolonged operation in orbit.


Wednesday, September 6, 2017

Lilium Secures $90 Million to Develop its Electric VTOL Plane

Like a few other startups, Lilium wants to make our flying car dreams come true with an electric VTOL craft you can summon with an app. Thanks to $90 million worth of new investment from China's Tencent and others, the startup may now have a leg up on its rivals. It will use the funds to drastically expand hiring in order to take the electric jet into the next stages of development.

Lilium differs from Ehang's passenger drone and other similar concepts. Rather than using multi-rotors to both lift and propel it, the craft uses flaps with electric "jets" that rotate from a vertical to a horizontal position. By tilting them into a vertical position, it can take off like a helicopter, then rotate them horizontally to transition into conventional flight. That's similar to how Boeing's V-22 Osprey works, for instance, but with many more engines. It also works the same way as NASA's smaller-scale "Greased Lightning" VTOL craft.
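The tilt-to-transition scheme comes down to basic trigonometry: at any tilt angle, the jets' total thrust splits into a vertical (lift) component and a horizontal (forward) component, and the transition is a smooth trade between the two. A rough sketch, where the thrust figure is a made-up illustration rather than anything from Lilium's spec:

```python
import math

# Thrust split for a tilting "electric jet" (illustrative numbers).
# tilt = 90 deg: jets point straight down (pure lift, helicopter-style);
# tilt = 0 deg: jets point straight back (pure forward thrust).
def thrust_components(total_thrust_n: float, tilt_deg: float) -> tuple[float, float]:
    tilt = math.radians(tilt_deg)
    lift_n = total_thrust_n * math.sin(tilt)
    forward_n = total_thrust_n * math.cos(tilt)
    return lift_n, forward_n

# Hypothetical 20 kN of total thrust, halfway through the transition:
lift, forward = thrust_components(20_000.0, 45.0)
print(f"lift={lift:.0f} N, forward={forward:.0f} N")
```

This also shows why the transition is the delicate part: mid-tilt, neither component gets the jets' full thrust, so the wing has to start carrying the craft as lift from the engines falls away.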

Lilium is backed by the European Space Agency (ESA) and has actually flown a full-sized, two-seat prototype, albeit with nobody in it (below). It recently hired engineers and employees from Gett (an Uber and Lyft ride-sharing rival), Airbus and Tesla. The company plans to build a five-seat "air taxi" that could ferry passengers 186 miles at around 186 mph. It has ambitiously planned its first manned tests by 2019, and passenger flights by 2025.


It's going to be tough for the company to get aircraft approved in the US, however. FAA certification is notoriously difficult, for one thing, especially for an all-new type of aircraft. There's also no current battery technology that can give Lilium the range it wants, and VTOL requires much more energy than conventional flight. Elon Musk -- who has designed his own electric plane, of course -- said the battery density threshold is about 400 Wh/kg, compared to around 250-300 Wh/kg available in current Tesla models.
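Musk's 400 Wh/kg threshold makes the weight penalty concrete: for a fixed amount of onboard energy, pack mass scales inversely with specific energy. A back-of-the-envelope comparison, using a hypothetical 150 kWh pack (an assumption for illustration, not Lilium's actual figure):

```python
# Pack mass at different specific energies (Wh/kg) for a fixed energy.
# The 150 kWh pack size is a hypothetical figure for illustration.
pack_wh = 150_000  # 150 kWh

for wh_per_kg in (250, 300, 400):
    mass_kg = pack_wh / wh_per_kg
    print(f"{wh_per_kg} Wh/kg -> {mass_kg:.0f} kg pack")
```

At today's roughly 250 Wh/kg the same energy costs 600 kg of battery versus 375 kg at Musk's threshold, and every extra kilogram of pack demands still more energy to lift vertically.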

However, Lilium seems confident it can overcome those problems via its efficient "electric jet" engines, lack of a tail and other technological innovations. If it can pull it off, it would be a grand feat, and the smog-free craft could definitely revolutionize urban transport. Hopefully, it's painting a realistic picture for investors, because 2025 is just eight years away, barely an eyeblink in the world of aircraft development.




Tuesday, September 5, 2017

Jaguar's Steering Wheel of the Future Revolves Around AI

As reported by Engadget: The steering wheel as we know it doesn't have a bright future -- in fact, it might disappear altogether as self-driving cars hit the road. Jaguar Land Rover, however, has an idea as to how it might survive. The British automaker has unveiled a concept steering wheel, Sayer, that's designed for an era where cars normally drive themselves and personal ownership is a thing of the past. The wheel would have its own AI system, and would follow you from car to car -- you'd just hook it in to bring your experience with you.

The AI would largely serve as a concierge. It would link you to an on-demand service club, whether or not you own your car, and would help you get a ride when and where you need it. If there's a must-attend meeting, for example, you could tell the wheel while it's still in your living room and it would figure out when a car needs to arrive and tell you when you might want to take control.

Sayer (named after influential designer Malcolm Sayer) will be a core feature on an upcoming concept car, the Future-Type.

Will something like this wheel ever reach production? Probably not. Jaguar Land Rover is making a few assumptions about self-driving cars, such as the likelihood that you'll have a steering wheel and the need to integrate AI into a dedicated device. Your phone and a cloud service might be all you need. Instead, we'd treat this as a thought exercise. It might never come to pass, but it could give engineers something to consider when they design the first wave of autonomous vehicles.