The Chinese-based and -backed startup Byton was also back at CES this year, once again showing off a vehicle concept that similarly combines the traditional automotive control scheme (a steering wheel) with the future (a massive touchscreen). Byton’s screen wraps around the entire front of the dashboard, looking from many angles like it obstructs crucial outward visibility. Byton says the screen aligns with the sight lines out of the vehicle, so it doesn’t physically disrupt the driver’s outward vision, but the fact remains: that thing is going to be distracting as heck.
In the past, Byton has said that its vehicles would launch with a Level 3 automated driving system and eventually roll out Level 4 autonomy as it becomes available, but its self-driving development partner Aurora is working only on Level 4 and (wisely) refuses to say when its system will be ready. This puts Byton in a situation that exemplifies where the entire space is: it wants to load up its cars with screens to enable a digital ecosystem that promises a far better business model than making and selling cars, but the autonomous capabilities aren’t ready to enable that yet.
With any luck, this will be the last CES where we see so many steering wheels facing huge screens that will be useless at best without full autonomy of some kind and dangerous at worst. For now, this bizarre juxtaposition is a symbol of the weird position the future of cars finds itself in, and a reminder that distracted driving is one of the most insidious threats on the road. Displays that route valuable safety information to drivers to keep them safer are one thing, but until true human-out-of-the-loop autonomy matures we need to remember that giant screens are the enemy of safe human driving.
The “other autonomous vehicle sensor” takes a bow
At an event that tends to focus on “game changers” and ambitious new efforts, sometimes the most interesting developments are the little ones that nobody seems to pay attention to. One of those widely overlooked debuts was the sudden appearance of thermal imaging cameras on the beautifully packaged sensor suite of the Toyota Research Institute’s P4 autonomous test vehicle.
TRI’s press conference was workmanlike and hype-free, in line with the new industry mood, with updates on its Level 4 “Chauffeur” self-driving system as well as its “Guardian” augmented driving concept. It’s interesting that, two years after the announcement of Guardian, TRI remains alone in developing an automated driving system designed to keep a human driver safe rather than remove them from the loop, and based on some of the questions at TRI’s press conference and subsequent roundtable, a number of people still don’t fully understand the concept (yet another educational challenge for PAVE to tackle!). But the addition of thermal imaging cameras, from the German supplier Allied Vision, was the most concrete sign at CES of AV developers learning from the last year of tragedy and failed hype.
TRI declined to tie the inclusion of thermal imaging in its sensor stack to the tragic death of Elaine Herzberg, who was struck by an Uber autonomous test vehicle last year, but it would be surprising if the two weren’t at least indirectly related. Identifying pedestrians at night and in low-light conditions is the core application for thermal imaging devices, and as Alex Roy has pointed out, there’s little doubt that a good thermal system could have at least given the Uber car a better chance of detecting Herzberg and avoiding her. Since Toyota is partnering with Uber on a new fleet of autonomous vehicles in which the Guardian system will keep a watchful eye on Uber’s self-driving stack, the use of thermal cameras in the TRI sensor suite will hopefully help prevent future tragedies.
TRI CEO Gill Pratt said that the inclusion of thermal imaging was part of TRI’s ongoing effort to diversify its sensor stack and that it was crucial to helping avoid weird edge cases such as a vehicle camera detecting a human that was actually an image on a billboard or the side of a vehicle. By recognizing heat signatures, TRI’s car can now detect the difference between a graphical representation of a human and an actual person, improving its object classification and eliminating one way its sensors could be fooled.
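The cross-check Pratt describes can be sketched in a few lines: a camera detection only counts as a real pedestrian if the thermal channel shows a body-heat signature in the same region, since a photo of a person on a billboard stays at ambient temperature. This is a minimal illustration under assumed names, thresholds, and data layout, not TRI's actual implementation.

```python
import numpy as np

# Rough lower bound for human skin/clothing temperature (illustrative assumption)
BODY_TEMP_C = 30.0

def confirm_pedestrian(thermal_frame, bbox, min_temp_c=BODY_TEMP_C,
                       min_warm_fraction=0.2):
    """Return True if a camera-detected box contains a heat signature.

    thermal_frame: 2D array of per-pixel temperatures in Celsius.
    bbox: (row0, col0, row1, col1) region proposed by the RGB detector.
    """
    r0, c0, r1, c1 = bbox
    patch = thermal_frame[r0:r1, c0:c1]
    if patch.size == 0:
        return False
    # Fraction of pixels in the box at or above body temperature
    warm_fraction = np.mean(patch >= min_temp_c)
    # A printed image of a human sits at ambient temperature and fails
    # this check; a live pedestrian passes.
    return bool(warm_fraction >= min_warm_fraction)

# Ambient scene at 20 C with one warm pedestrian
frame = np.full((100, 100), 20.0)
frame[10:40, 10:30] = 33.0           # live person
live_box = (10, 10, 40, 30)
poster_box = (50, 50, 90, 90)        # picture of a person on a billboard

print(confirm_pedestrian(frame, live_box))    # True
print(confirm_pedestrian(frame, poster_box))  # False
```

In a real stack the thermal camera would be calibrated and registered to the RGB camera so the two views share coordinates, but the classification logic reduces to exactly this kind of consistency check between modalities.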
Micromobility shows off success, hints at growing pains
Scooter giant SegwayNinebot was feeling its oats at CES this year, having sold some 1.5 million electric scooters last year. With all the talk about 2018 being the “year of the scooter,” the Chinese manufacturer is looking forward to another strong year of sales to shared micromobility platforms like Bird, Lime, and Uber, as well as diversifying into more consumer micromobility products and even delivery robots. Still, CES seems to be lagging behind developments in mobility technology, leaving SegwayNinebot in the back of Hall 2 among drones and robots to which its products may be related but whose business it has clearly outgrown.
Interestingly, SegwayNinebot’s newest shared-fleet scooter is a lot more robustly made than the scoots currently found on apps like Bird, suggesting that some scooter operators are having to replace scooters faster than many realize. With larger wheels and tires and extra heft, ride quality will likely improve, especially in places with rougher roads (like Detroit, where I took my first Bird ride), but keeping these assets in use long enough to pay them off is clearly one of the major motivations behind the new generation of beefed-up scooters.
Whether the scooter trend keeps booming and SegwayNinebot sells out of its shared scooters again this year remains to be seen, and replacing fragile scooters may well be good for the manufacturer but not for the sharing companies. In any case, consumer goods seem to be the focus, as scooters are blossoming into a range of electric skates, one-wheels, dirt bikes, go-karts and other electric mobility devices ranging from fun to functional.
With shared platforms opening consumers’ minds and getting the public experimenting with new micromobility concepts, it will be interesting to watch how private demand is affected. Not only do companies like Bird and Lime have to deal with vandalism, theft, and worn-out scooters, but they also run the risk of marketing these products to people who will end up buying whichever devices are most useful for their lifestyle. If 2018 was the “year of the scooter,” 2019 could see some of these challenges hitting home for shared fleets even as sales boom for privately owned micromobility and SegwayNinebot’s distributors start taking up more of its production capacity.
Suppliers inching closer to OEM status
Outside the auto industry it’s easy to forget about the massive Tier One suppliers, companies with names like ZF, Continental, and Bosch, which make much of the technology you see on new cars. Automakers famously lean on these companies to keep them competitive, giving their engineers spec sheets and wish lists for everything from traditional gearboxes and ECUs to electric drive technology and autonomous drive “nervous systems.” These major suppliers are caught between the OEM squeeze to deliver better margins on the one hand and the drive to integrate these technologies into more value-added products on the other, with a major risk always hanging over their heads: if they start to look too much like OEMs themselves, they risk turning customers into competitors.
One of those established players, ZF, is trying to square this circle by partnering in a joint venture with a German shuttle company called e.GO. ZF supplies the partnership with everything from an electric drivetrain to its ProAI autonomous drive central ECU, making the firm’s new shuttles a ZF product in everything but name. Integrating and manufacturing all this equipment at a brand-new startup shows the world just how easy it is to pull together ready-to-roll components from ZF’s massive operations while keeping the project at arm’s length.
A newer player in the space, NVIDIA, is taking a different approach, burrowing deeper into the Mercedes-Benz future product portfolio by partnering with the German OEM on next-generation smart vehicles. Mercedes already buys a huge number of NVIDIA processors for its MBUX interior concept, and this next-generation partnership will put the Silicon Valley company at the heart of both its driver/passenger-facing capabilities and its automated driving system. Like ZF’s ProAI, the NVIDIA-Mercedes partnership aims at the holy grail of replacing the constellation of microcontrollers in a modern car with a single, centralized computing system, opening a range of new options for increasingly defining car functions in software rather than hardware.
The coolest thing I learned during CES
At a well-attended party thrown by The Autonocast (which I co-host, check it out!), Dr. Anna Newberry of Cruise Automation told me that an ADAS developer working in Australia had tried to create a prediction model for kangaroos in order to build an automated emergency braking system for the Antipodean market. Apparently, nature’s most hilarious creation bounces about so unpredictably that this not-insignificant effort failed, leaving Australians to their own devices when it comes to these long-tailed obstacles. But don’t worry: a new company has taken on the challenge, meaning we could one day understand the patterns underlying kangaroo motion and make cars that can avoid them. In the meantime, stay safe out there, mates!
Source: http://www.thedrive.com/tech/25970/ces-was-boring-this-year-and-why-thats-ok