Vehicle-to-everything (V2X) 4G and 5G connectivity via small cells can be a lifesaver.

Carsofthefuture.co.uk editor to host Automotive & Transportation session at Small Cells World Summit 2021

Carsofthefuture.co.uk has signed a media partnership agreement with The Small Cell Forum (SCF) for its three-day online Small Cells World Summit, The Future of Mobile Networks, on 11-13 May 2021.

Small Cells World Summit 2021 registration

As part of the deal, Carsofthefuture.co.uk editor Neil Kennett will moderate the Automotive & Transportation session from 11am on Wednesday 12 May, with high-profile speakers including: Peter Stoker, Chief Engineer for Connected and Autonomous Vehicles at Millbrook Proving Ground; Dr Maxime Flament, Chief Technology Officer at the 5G Automotive Association, one of the world’s leading authorities on Intelligent Transport Systems (ITS); Bill McKinley, Connected Car Business Lead at Keysight Technologies; and Mark Cracknell, Head of Connected and Automated Mobility at Zenzic, responsible for accelerating the self-driving revolution in the UK.

Neil Kennett said: “We are delighted to partner with The Small Cell Forum for this exciting virtual event, which brings together mobile operators, vendors and regulators from around the globe. The Automotive & Transportation session will focus on connected and autonomous vehicle (CAV) opportunities, particularly vehicle-to-vehicle (V2V) and vehicle-to-everything (V2X) communications, in-vehicle payments, and the rival ITS-G5 and C-V2X 5G technologies.

“Small cells deliver high-quality, secure 4G and 5G coverage, so there are clearly a multitude of new use cases in the connected car world and the wider mobility ecosystem. Aside from supporting self-driving, they can facilitate everything from in-car infotainment and shopping, to fixing technical problems before they occur and pre-empting likely crash scenarios. It is no exaggeration to say they could be a lifesaver.”

Carsofthefuture.co.uk readers can benefit from a 40% discount on Small Cells World Summit 2021 tickets using the code SCWS2021. See www.smallcells.world/

CGA’s simulations train autonomous vehicles to deal with environments specific to the UK.

Self-driving and smart cities: stop wishcasting and get real with predictive simulation

Our Zenzic CAM Creator series continues with Liverpool-based Jon Wetherall, Managing Director of CGA Simulation, and Max Zadow, Director of Future Coders.

By applying gaming knowledge to real-world mobility questions, CGA has created engaging simulations to study autonomous driving and smart city solutions.

JW: “My background is gaming. I used to work for the company that did Wipeout and F1 games. We made a racing game called Space Ribbon and one day, about five years ago, we got a call from The Department for Transport (DfT). They were doing a research project on virtual reality (VR) in the testing and training of drivers, specifically hazard awareness.

“We turned it into a game and it worked – people said their attitudes changed as a result of our simulations. The hardest scenario came early in the game – a parked lorry with a big blind spot – and a lot of people crashed. VR feels so visceral, the experience can be quite vivid and shocking. Of course, smarter cars will hopefully fix these types of situations.”

CGA Simulation junction and forecourt

To pursue this goal, CGA received a grant from Innovate UK to create an artificial learning environment for autonomous driving (ALEAD).

JW: “The aim was to make these cars safer and we stayed true to our computer game history. We didn’t have the resources to lidar scan the whole area, so we did our own thing using mapping data. We made a digital twin of Conwy in north Wales and unlike other simulations we kept all the ‘noise’ in – things like rain. This was important because it is now well-understood that noise is a big challenge for autonomous vehicles (AVs).

“Modern autonomous driving stacks have 20 different subsystems and we generally focus on only one or two, to do with perception. There’s been massive progress in this area over recent years, to the extent that artificial intelligence (AI) can identify an individual by their gait. What’s more, you can now do this on a computer you can put in a car – this is one of the cornerstones of driverless.

“It’s not the first time people have been excited about AI. In the 50s they were saying it was only a few years away. It has taken much longer than people thought, but major problems have now been solved.

“We are lucky to have one of the world’s leading experts in radar on our doorstep, Professor Jason Ralph of The University of Liverpool, and he helped us develop the simulation. You have to feed the car’s brain, a computer, all the information it will need – from sensors, cameras, GNSS – and you can do all that in the software.”

MZ: “In particular, The University of Liverpool were interested in how weather affects things, right down to different types of rain and mist. In California, if an AV encounters conditions it can’t handle, like heavy rain, it pulls to the side of the road. That’s ok for San Francisco but not for Manchester!
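
To give a flavour of the kind of ‘noise’ CGA keeps in its simulations, here is a minimal sketch (an editorial illustration, not CGA’s code) in which rain intensity degrades simulated camera range and GNSS accuracy; the function names and noise model are assumptions.

```python
import random

def degrade_camera(visibility_m: float, rain_intensity: float) -> float:
    """Reduce effective camera visibility as rain gets heavier (illustrative model)."""
    # Heavier rain scatters more light, so the visible range drops off.
    return visibility_m * max(0.2, 1.0 - 0.6 * rain_intensity)

def degrade_gnss(true_pos: tuple, rain_intensity: float) -> tuple:
    """Add position error that grows with the weather (illustrative model)."""
    sigma = 1.0 + 4.0 * rain_intensity  # standard error in metres
    return (true_pos[0] + random.gauss(0, sigma),
            true_pos[1] + random.gauss(0, sigma))

if __name__ == "__main__":
    for rain in (0.0, 0.5, 1.0):  # dry, moderate, heavy
        print(rain, degrade_camera(200.0, rain), degrade_gnss((0.0, 0.0), rain))
```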

“A few years ago, everyone seemed to be using the example of an AV encountering a kangaroo. How would it cope? The point is you can use our simulations to train cars, to create algorithm antibodies for once-in-a-lifetime events and regular things in different environments. That remains an essential part of what’s needed to make AVs a reality.

“We picked Conwy partly because it has very different patterns of land use to America. An early use case for AVs is predicted to be taxis, but in the UK these are most frequently used by people who don’t own their own car, and they often live in high density housing or narrow streets. The operational design domains (ODDs) are going to have to deal with environments specific to this country – steep hills, roads which twist and turn, and changeable weather.”

Mobility Mapper

Wetherall and Zadow’s latest collaboration is Mobility Mapper, a project to create greener and more intelligently designed transport hubs. The technology underpinning Mobility Mapper has previously been used by the team to model Covid-19 spread and autonomous vehicle technology, and by the Liverpool 5G Create project (funded by DCMS as part of its 5G Testbeds and Trials Programme).

JW: “E-hubs are basically an extension of what used to be called transport hubs – train or bus stations. They’ll provide charging facilities and access to different modes of transport, for example, you can drop off an e-scooter and hop into a shared autonomous car.

“Here in Liverpool, there was a big trial of e-scooters, big in international terms not just UK. The worry was that a lot of them would end up in the canal, but that didn’t happen. The trial was incredibly successful. It’s all about linking that movement and nudging people away from car ownership.”

MZ: “We were already thinking about how Jon’s technology could be used for mobility as a service (MaaS) when we attended a virtual future transport conference in LA with the Centre for Connected and Autonomous Vehicles (CCAV).

“That was an influence, as was an Intelligent Transportation Systems (ITS) trade show in Copenhagen, where we saw an autonomous tram system designed to take bicycles. It was a small step from there to imagining autonomous trams carrying autonomous delivery pods.

“This is classic smart city stuff but you need to know how these e-hubs are likely to be used, with no track record, nothing to go on. We need simulated environments to make best guesses in. That’s Mobility Mapper.”

JW: “It is early days, still in the development phase, but the authorities in both Manchester and Liverpool have agreed there’s a need for such a predictive simulation tool.”

As we wrap up a thoroughly enjoyable interview, Max dons his Director of Digital Creativity in Disability hat: “Autonomous delivery bots are basically electric wheelchairs without a person, so there’s clearly a potential benefit, but there needs to be less wishcasting and more real work on how accessibility will be affected.”

For further info, visit CGAsimulation.com

The key to the future of self-driving is education, education, education, says Millbrook’s Stoker.

On track and in virtual space, Millbrook tests cars of the future

Our Zenzic CAM Creator series continues with Peter Stoker, Chief Engineer for Connected and Autonomous Vehicles at Millbrook.

Part of CAM Testbed UK, Millbrook Proving Ground in Bedford boasts 700 acres of private roads on which to develop and test connected and autonomous vehicle (CAV) technologies. As Chief Engineer, Peter Stoker is right at the forefront of self-driving in the UK.

Peter Stoker, Chief Engineer for Connected and Autonomous Vehicles at Millbrook

Please can you outline Millbrook’s work on connected and automated mobility?

“My primary role is to bring focus to two testbeds, our CAV testbed and our 5G testbed. We are not a purpose-built CAV testbed – we have safety, propulsion and conventional vehicle test facilities too – so CAV is something we’ve blended into the existing business.

“For the CAV testbed, we partnered with the UK Atomic Energy Authority (UKAEA), particularly the Remote Applications in Challenging Environments (RACE) division, to provide a controlled urban environment. We have three open source StreetDrone vehicles and miles of track with targets for very precise measurements, accurate to 1-2cm. We offer safety driver training and also have a simulation environment for driver-in-the-loop and hardware-in-the-loop testing. The whole idea is to fail in private, not in public, and to progress, to evolve out of the testbeds and on to open roads.

“The 5G testbed is a completely separate consortium, backed by the Department for Digital, Culture, Media and Sport (DCMS). We have 59 masts looking at all types of connectivity and I’d say the millimetre wave at 70GHz is currently the most interesting.”

Millbrook Proving Ground graphic

What major shifts in UK road transport do you expect over the next 10 years? 

“Getting the crystal ball out, I see increased use of connectivity in existing vehicles and some very interesting new use cases – buses connected to city networks, video analytics from cameras, smart ambulances streaming live data, autonomous deliveries on campuses. What I don’t see within 10 years is millions of privately owned driverless cars. That will start in the luxury sector but to begin with it will be more about transporting goods.”

How do you see the testing framework for CAVs developing?

“There’s a lot of simulation in the automotive world – crash testing, fatigue testing, computational fluid dynamics. These days, manufacturers are developing whole vehicles before building a prototype. You have to have a good simulation on a good simulator and there’s an interesting shift that needs to happen on regulation. It’s early days on that, but it’s essential.

“The strength of virtual space is that you can run hundreds of scenarios in machine time – not only set up complicated scenarios that would take days with real cars, but actually speed up the process so it runs faster than real time. The national scenario database is already really good and regulation will move to being a mixture of real and virtual certification – global, European, UK and perhaps even city-specific. We are happy to advise, but don’t set policy.”
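
To make the ‘machine time’ point concrete, here is a hypothetical sketch of a headless scenario runner that steps a stub simulation with a fixed timestep, unconstrained by the wall clock; the scenario format and step logic are invented for illustration, not Millbrook’s tooling.

```python
import time

def run_scenario(scenario: dict, dt: float = 0.05) -> dict:
    """Step a (stub) simulation to completion as fast as the CPU allows."""
    t, collisions = 0.0, 0
    while t < scenario["duration_s"]:
        # A real simulator would update vehicle dynamics and sensor models here.
        t += dt
    return {"name": scenario["name"], "sim_seconds": t, "collisions": collisions}

if __name__ == "__main__":
    scenarios = [{"name": f"cut-in-{i}", "duration_s": 60.0} for i in range(100)]
    start = time.perf_counter()
    results = [run_scenario(s) for s in scenarios]
    wall = time.perf_counter() - start
    sim_total = sum(r["sim_seconds"] for r in results)
    print(f"Simulated {sim_total:.0f} s of driving in {wall:.2f} s of wall-clock time")
```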

What are the biggest challenges in the shift to self-driving and how can these risks be mitigated?

“The key to the future of self-driving is education, education, education – for everyone, the public, vehicle manufacturers, the aftermarket, recovery operators. We have to work on the terminology – autonomous, driverless, CAV, CAM – it’s confusing, even to people who know what they’re talking about.

“At the moment, we’re making it harder to understand, not easier. We’re in a really grey area of transition with different trade names for systems. There’s a lot of groundwork needed to prepare people; the brilliant website mycardoeswhat.org, for example, does a great job of explaining it.

“If you get into a hire car, you need to have the right expectation of what it does and what it doesn’t do. If you buy a new car, you should read the manual, but how many people do? Especially with Covid, more cars are being delivered with minimal interaction – it’s a case of ‘there’s the key, where’s the station?’. Too often, the customer handover just isn’t there.

“How are garages, the aftermarket and the amber light sector going to deal with all this? Basic questions like how do you put it in neutral? ADAS has already led to huge changes in training and skill sets – how to calibrate and monitor them.

“We haven’t talked about over-the-air (OTA) updates, cameras embedded in the tarmac or even electrification – there’s a huge amount of things! How do you learn about them? Hopefully in testing rather than in crash situations.”

For further info, visit www.millbrook.co.uk

Humanising Autonomy uses behavioural psychology and computer algorithms to make cities safer for pedestrians and cyclists.

Using cameras and AI to protect vulnerable road users

Our Zenzic CAM Creator series continues with Raunaq Bose, co-founder of Humanising Autonomy.

Before establishing predictive artificial intelligence (AI) company Humanising Autonomy in 2017, Raunaq Bose studied mechanical engineering at Imperial College London and innovation design engineering at the Royal College of Art. Focusing on the safety of vulnerable road users, Humanising Autonomy aims to redefine how machines and people interact, making cities safer for pedestrians, cyclists and drivers alike.

RB: “Our model is a novel mix of behavioural psychology, deep learning and computer algorithms. We work with OEMs and Tier 1 suppliers on the cameras on vehicles, with the aftermarket on retrofitted dashcams, and also with infrastructure. Our software works on any camera system to look for interactions between vulnerable road users, vehicles and infrastructure in order to prevent accidents and near misses. While most AI companies use black box systems where you can’t understand why decisions are made, we set out to make our models more interpretable, ethically compliant and safety friendly.

“When it comes to questions like ‘Is this pedestrian going to cross the road?’, we look at body language and factors like how close they are to the edge of the pavement. We then put a percentage on the intention. Take distraction, for example, we cannot see it but we can infer it. Are they on the phone? Are they looking at the oncoming vehicle? Is their view blocked? These are all behaviours you can see and our algorithm identifies them and puts a numerical value on them. So we can say, for example, we’re 60% sure that this pedestrian is going to cross. This less binary approach is important in building trust – you don’t want lots of false positives, for the system to be pinging all the time.
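
Bose doesn’t share Humanising Autonomy’s model, but the idea of turning observable cues into a percentage can be sketched roughly as follows; the cue names and weights are invented for illustration.

```python
def crossing_intent(cues: dict) -> float:
    """Combine observable behavioural cues into a rough crossing probability (illustrative weights)."""
    score = 0.1  # small prior: most pedestrians near a road are not about to cross
    # Closer to the kerb edge -> stronger signal (saturates at 2 m).
    score += 0.45 * (1.0 - min(cues["distance_to_kerb_m"], 2.0) / 2.0)
    # Body orientation and gaze towards the carriageway add weight.
    score += 0.25 if cues["facing_road"] else 0.0
    score += 0.20 if cues["looking_at_vehicle"] else 0.0
    return max(0.0, min(1.0, score))

if __name__ == "__main__":
    pedestrian = {"distance_to_kerb_m": 0.4, "facing_road": True, "looking_at_vehicle": False}
    print(f"Crossing intent: {crossing_intent(pedestrian):.0%}")  # e.g. 71%
```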

“One of the main things we’re likely to see over the next decade is increased use of micromobility, such as cycling and e-scootering. At the same time you will see more communication between these different types of transportation, and also with vehicles and infrastructure. The whole point of ADAS is to augment the driver’s vision, to reduce blind spots and, if necessary, take control of the vehicle to avoid a shunt. Then there’s the EU agreement that by 2022 all buses and trucks must have safety features to detect and warn of vulnerable road users.

“We currently only look at what’s outside the vehicle, but with self-driving there will be monitoring of the cabin. In terms of privacy, we have a lot of documentation about our GDPR processes and how we safeguard our data. Importantly, we never identify people – for example, we never track a particular individual across camera streams. We look to the future with autonomous cars but for now we’re focused on what’s on the road today.”

For further info visit humanisingautonomy.com.

UK government sparks global business sharing transport sector data.

Sharing data collected by connected cars

Our Zenzic CAM Creator series continues with Mika Rasinkangas, founder and President of Chordant.

Originally part of the global wireless and internet of things (IoT) research company InterDigital, Chordant was spun out as a separate business in 2019 as “a dynamic data sharing expert”. The spark was a UK government initiative to test the hypothesis that regional transportation data has tremendous value, especially when shared between different parties. The results of this two-year public-private partnership were startling.

Please can you outline your work on connected and automated mobility?

MR: “First of all we looked at the mobility space. There’s the segment that maintains the road network and their supply chain, the mobility service providers – bus companies, train operators and new entrants such as Uber – then the whole automotive sector, OEMs and their supply chain partners. We sit right in the middle of all this and our role is data exchange – bringing dynamic data sets from different sources together to come up with something different that solves problems with data-driven solutions.

“The hypothesis was that a lot of data in the transport segment was either underutilised, in really small silos, or not utilised at all. The idea was to work with different entities – organisations, companies and universities – to bring data together and make it more widely available, leading to innovation and efficiency.

“It was obvious from early on that this was not only a technical issue, there was a human element. Data is controlled by different entities and departments so the challenge was to get these different data owners comfortable with the idea that their data could be used for other purposes, and to get consumers comfortable with it too. The result was more usable and more reliable dynamic data.”

What major shifts in UK transport do you expect over the next 10-15 years?

MR: “Last-mile transport and micromobility solutions are ballooning, and Covid-19 will only accelerate this. People are walking, scootering and biking more, making short trips by means which don’t involve public transport or being in close contact with others.

“In terms of automotive, we’re living through a massive change in how people perceive the need to own a car, and this shift in perception is changing the fundamental business models. Autonomous vehicle technology keeps developing, connected vehicles are everywhere already and electric cars represent an ever bigger proportion of the vehicle population. In all these segments data utilisation will continue to increase. New cars collect huge amounts of data for lots of purposes and this can be used for lots of things other than what it was originally collected for.”

Can you address the data privacy concerns surrounding connected cars?

MR: “Data privacy is a multifaceted topic. On the one hand, Europe has been at the forefront of it with GDPR. That puts businesses operating in Europe on a level playing field. In terms of connected and autonomous vehicles (CAVs), these regulations set limitations on what data can be harvested and what has to be anonymised in order for someone to use it. It fits the norms of today’s society, but you can see on social media that this kind of privacy seems less important to younger people. However, perspectives vary greatly, and companies need to be transparent in their use of people’s data.

“From a business perspective, we have to take privacy extremely seriously. The explosion of data usage can have unintended consequences but by and large the regulatory environment works quite reasonably.

“We typically deal with conservative entities which put privacy and security at the centre of everything – if there’s any uncertainty, it’s better not to do it, is the attitude. Think of all the sensitive personal data that entities like car companies and mobile telephone companies have. It can give an extremely accurate picture of people’s behaviour. There are well-established procedures to anonymise data, so customers can be comfortable that their personal data cannot be identified.”

What are the main risks in the shift to self-driving and how can these be mitigated?

MR: “One could talk about a lot of different challenges. What about the latency in connectivity needed to ensure processing takes place fast enough? There are a gazillion things, but to me these are technical nuts that will be cracked, if they haven’t been already. One of the biggest challenges is the interaction between human-controlled vehicles and automated vehicles. When you add in different levels of driver assistance, urban and rural settings, and different weather conditions – all sorts of combinations can happen.

“The UK is at the forefront of CAV testing. There are government sponsored testbeds and companies are running trials on open roads, so the automotive industry can test in real-life environments. We cannot simulate everything, and the unpredictability of interactions is one of the biggest challenges. A traffic planner once told me that in his nightmares he sees a driverless car heading toward a granddad in a pick-up truck, because there’s just no telling how he might react!”

Is there anything else you’d like to mention?

MR: “I’d like to address the explosion of data usage in mobility and how dynamic data enables not only efficiency improvements but new business models. According to recent studies by companies like Inrix, congestion costs each American nearly 100 hours or $1,400 a year. Leveraging data-driven insights can drive change in both public policies and behaviours. In turn, these can result in reduced emissions, improved air quality and fewer pollution-caused illnesses.

“CAVs can be data sources providing tons of insight. Think about potholes – new vehicles with all these cameras and sensors can report them and have them fixed much more efficiently. This is just one example of entirely data-driven efficiency, much better than eyeballing and human reporting. There will be a multitude of fascinating uses.
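
As a toy illustration of that kind of data-driven report (not Chordant’s schema), a connected vehicle might share a pothole detection as a small, anonymised payload:

```python
import json
from datetime import datetime, timezone

def pothole_report(lat: float, lon: float, severity: float, suspension_g: float) -> str:
    """Build an anonymised pothole report; no vehicle or driver identifiers are included."""
    return json.dumps({
        "type": "pothole",
        "location": {"lat": round(lat, 5), "lon": round(lon, 5)},
        "severity": round(min(max(severity, 0.0), 1.0), 2),  # 0 = cosmetic, 1 = dangerous
        "suspension_peak_g": round(suspension_g, 2),         # what the sensors actually measured
        "reported_at": datetime.now(timezone.utc).isoformat(timespec="seconds"),
    })

if __name__ == "__main__":
    print(pothole_report(53.4084, -2.9916, severity=0.7, suspension_g=2.3))
```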

“Organisations such as vehicle OEMs, transport authorities and insurance providers will require facilities for the secure and reliable sharing of data, and that’s where we come in. I would urge anyone interested in data driven solutions in the mobility space to visit chordant.io or our Convex service site at convexglobal.io.”

Dr Joanna White says Highways England is currently more focused on the connected bit of connected and automated mobility (CAM).

Highways England expert predicts Level 4 self-driving in towns before motorways

Our Zenzic CAM Creator series continues with Dr Joanna White, Head of Intelligent Transport Systems at Highways England.

As the body responsible for designing, building and maintaining our motorways and major A-roads, Highways England (HE) is a uniquely important player in the UK connected and automated mobility (CAM) ecosystem. Here, Head of Intelligent Transport Systems at Highways England, chartered engineer Dr Joanna White, outlines its work on CAM.

Dr Joanna White, Head of Intelligent Transport Systems at Highways England

JW: “A key aim in improving our service is to look at how we can safely use emerging technology to better connect the country – people and places, families and friends, businesses and customers. This includes what digital channels we might use, delivering a cleaner road environment and achieving net zero carbon.

“Our connected corridor project on the A2/M2 in Kent finished 10 months ago and we are just completing the evaluation. Collaboration is vital and this was a joint project with Kent County Council (KCC), Transport for London (TfL), the Department for Transport (DfT) and others. It was also part of a wider European project, Intercor.

“We are currently more focused on the connected bit of CAM, building on the services we already provide. This includes beaming information directly into vehicles (replicating what you see on the gantries) and also what data we can anonymously collect from vehicles’ positioning sensors. Can we maintain service from one part of the network to another? Can we do it in an accurate, timely and secure way? How do people feel about it?

“We try not to choose particular technologies – whether it’s radar, lidar, cellular – we are interested in all of it. It could be 5G and, via the DfT, we work closely with the Department for Digital, Culture, Media and Sport (DCMS), which leads on that. One of the most positive government actions was the requirement for mobile operators to provide 90% coverage of the motorway network by 2026.

Highways England in-car upcoming junction message
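
As a purely illustrative sketch of what such an in-car message might contain (not Highways England’s actual format), a gantry sign replicated in the vehicle could be a small structured broadcast that the head unit renders:

```python
from dataclasses import dataclass, asdict
import json

@dataclass
class GantryMessage:
    """Hypothetical in-vehicle replica of a motorway gantry sign."""
    road: str
    junction: str
    advisory_speed_mph: int
    lane_closures: list
    text: str

if __name__ == "__main__":
    msg = GantryMessage(road="M1", junction="J13-J14", advisory_speed_mph=50,
                        lane_closures=[4], text="Congestion ahead, stay in lane")
    # A connected car could receive this over cellular and show it on the dashboard.
    print(json.dumps(asdict(msg)))
```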

“We were very proud to be involved with the HumanDrive project, in which a self-driving Nissan Leaf navigated 230 miles from Cranfield to Sunderland. It was a great learning experience in how to conduct these trials safely, underpinned by our safety risk governance. We had to identify all the risks of running such a vehicle on the strategic road network (SRN), and find ways to mitigate them. It was fascinating to see how it coped on different types of roads, kept to the lines and responded to road sign information.

“Then there’s our Connected and Autonomous Vehicles: Infrastructure Appraisal Readiness (CAVIAR) project, which has been slightly delayed due to Covid. We are building a simulation model of a section of the M1, a digital twin, and we have a real-world car equipped with all the tech which will start operating in 2021. That will collect a lot of data. This is one of our Innovation competition winning projects, run by InnovateUK.

“Within Highways England we have a designated fund for this kind of research, and that means we can invest in further trials and do the work needed to provide more vehicle-to-infrastructure (V2I) communications.

“Personally, I think that Level 4 self-driving, eyes off and mind off, is years away, perhaps decades, certainly in terms of motorway environments. However, we are constantly in discussion with government on these issues; for example, we contributed to the recent consultation on Automated Lane Keeping Systems (ALKS).

“Working closely with industry and academia, we have already started off-road freight platooning and are looking to move to on-road trials. We’ve had lots of discussions about freight-only lanes and the left lane is often suggested, but you have to consider the design of the road network. There are lots of junctions close to each other, so how would that work, especially at motorway speeds? At first, I see self-driving more for deliveries at slower speeds in urban areas but, as always, we will listen to consumer demand.”

For further info see highwaysengland.co.uk.

Creative technologist Ushigome on future vehicle-to-pedestrian (V2P) communications.

Self-driving news flash: flickering lights to replace eye contact in facilitating trust

Our Zenzic CAM Creator series continues with Yosuke Ushigome, Director at design innovation studio Takram.

Listing his primary interest as “emerging technologies”, London-based creative technologist, Yosuke Ushigome, has been working with Toyota on future car concepts for over 10 years. Here, he gives his thoughts on the key issues in driverless car design.

Yosuke Ushigome, Director at Takram

YU: “We come from a user experience (UX) background and over the years our projects with Toyota have got bigger and higher level. In 2018, with the e-Palette concept, we started taking a more holistic approach to mobility and automation – an on-the-ground people perspective on the entire system, rather than the UX of an interior, exterior or service.

“There’s going to be a trend in transparency and trust. How can designers help the systems, passengers, pedestrians and others to communicate? In the past, this has usually been based around the driver and passenger, but that’s got to expand. In cars of the future, pedestrians will not be able to look into the driver’s eyes – what’s driving might not even be on the car, it might be in the cloud.

“How can you communicate interactions that facilitate trust? That’s really interesting. People pick things up from little movements in their peripheral vision, so you come back to old school ideas like patterns of flickering lights. How fast it flashes, or flashing from left to right, could give people a little nudge, maybe help them to detect danger. This kind of experimentation will definitely increase.
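
The flickering-light idea could be prototyped as a simple mapping from vehicle intent to a light pattern; the states, frequencies and meanings below are invented for illustration, not Takram’s or Toyota’s designs.

```python
# Hypothetical mapping from a driverless vehicle's intent to an external light pattern
# that pedestrians can read at a glance (frequencies and sweeps are illustrative).
LIGHT_PATTERNS = {
    "yielding":      {"flash_hz": 1.0, "sweep": "left-to-right", "meaning": "I have seen you, go ahead"},
    "about_to_move": {"flash_hz": 4.0, "sweep": "centre-out",    "meaning": "I am about to pull away"},
    "cruising":      {"flash_hz": 0.0, "sweep": "steady",        "meaning": "Passing through, do not step out"},
}

def pattern_for(intent: str) -> dict:
    """Return the light pattern for a vehicle intent, defaulting to the cautious 'cruising' state."""
    return LIGHT_PATTERNS.get(intent, LIGHT_PATTERNS["cruising"])

if __name__ == "__main__":
    print(pattern_for("yielding"))
```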

“Level 5 autonomy seems to me to be very far off. Level 4, in areas where the road system is designed for self-driving, or on private roads where there’s more separation between vehicles and pedestrians, is coming rapidly – things like deliveries between factories. Starship delivery robots are already deployed in Milton Keynes and economics will drive adoption, especially with the pandemic.

“I would like to be part of this transformation, so long as it is inclusive. There’s an opportunity to meet the needs of people left behind by our existing transport, whether that’s physical disability or economic disadvantage.”

Toyota e-Palette concept, via Takram

Toyota had planned to showcase its e-Palette mobility solution at the Tokyo 2020 Olympic and Paralympic Games, so hopefully we’ll get to see it next summer.

For further info, visit Takram.com.

Vivacity Labs founder backs the citizen first vision of 21st century privacy.

Time for a grown-up conversation about cameras, AI, traffic flow and privacy

Our Zenzic CAM Creator series continues with the founder of Vivacity Labs, Mark Nicholson.

Vivacity uses sensors, cameras and artificial intelligence (AI) to provide “up-to-the-minute data on urban movement”, helping local councils to promote active travel, improve safety and reduce congestion. Big Brother you say? Well, it’s 2020 not 1984 and CEO Mark Nicholson is very happy to have that debate.

MN: “As the transport network becomes more complicated, local authorities need more powerful tools. Tech giants have invaded the ecosystem, and when you’re dealing with Uber and driverless cars, sending someone out with a clipboard just isn’t going to cut it. We bring new technology which tells them about their transport, so they can adapt and gain control over the ecosystem.

“We started with sensors and then video-based sensors, generating huge data sets and better quality data. We’ve looked at everything from cyclists undertaking to lockdown journey times and asked: how can we use this data to make the road system more efficient? The next phase is autonomous vehicles, because that ecosystem needs to work with both infrastructure and other road users.

“Privacy is not just a key issue in self-driving but in the whole smart city. There are basically two visions – the Chinese and the European. The Chinese vision is very invasive, it’s 1984 and that’s the point. The alternative is the European vision, with the General Data Protection Regulation (GDPR). For a while it looked like there might be a third, a corporate American vision. Google were running a smart city project in Canada, but it didn’t work out so we’re back to two models.”

If you don’t know about the Quayside project in Toronto, a much-shared Guardian article from 2019 warned of surveillance capitalism, data harvesting and the possibility that algorithms could be used to nudge behaviour in ways that favour certain businesses. You can read it here or, er, Google it.

MN: “We’re very much on the European, privacy-centric, citizen first side – an ecosystem that gives the benefits of mass data without the costs to privacy. All our data is anonymised at source, everything. Each camera or sensor unit has its own processor on board which uses AI to extract information, for example, what are the road users? The imagery is discarded within a few milliseconds, all we keep is the data. We recently looked at how socially distanced people were in Kent and, although no personal data was collected, it caused a bit of controversy.”

It did indeed. “Big Brother is watching your social distancing: Fury as traffic flow cameras are secretly switched to monitor millions of pedestrians in government-backed Covid project”, screamed the headline in the Daily Mail. We’d better get back to self-driving.

MN: “Over the last couple of years the hype around driverless cars has died down. There’s been a recognition that autonomous vehicles are not as close as Elon Musk promised. The technology is progressing though. They can drive quite well on motorways and in quiet areas, but in busy, congested areas they struggle.

“What would happen if you rolled out driverless cars today? My suspicion is they would probably perform to about the same level as human drivers. The question is: Are we happy with systemic risk rather than personal risk? Can we engineer out that risk? Can we make the infrastructure intelligent enough so it works with vehicles in even the most challenging situations?

“The best way to end the no-win scenario is to have enough data to dodge it. Most of these incidents come about due to an unforeseen element, such as a pedestrian stepping out, a cyclist skipping a red light or someone speeding round a corner. If the vehicle knows about it in advance, the trolley problem never occurs. For me it’s about having the data earlier, and how we, as representatives of infrastructure, can help to give cars that information.”

For further info, visit vivacitylabs.com.

Influential designer sees an opportunity to rethink the whole UK transport system.

Designer Priestman questions carmakers and champions elegant public transport

Our Zenzic CAM Creator series continues with award-winning designer Paul Priestman, co-founder of PriestmanGoode.

Famous for designing Virgin’s Pendolino train and the BT HomeHub, Paul Priestman is one of the UK’s 500 most influential people, according to The Sunday Times. Here, he describes three exciting connected and automated mobility concepts: 1) The Moving Platforms infrastructure network; 2) A modular electric car for autonomous network transit (ANT) company, Dromos; and 3) The Scooter for Life automated electric scooter.

PP: “I’ve always been interested in mass transit and its relationship with the city. Over 30 years, the company has grown and we’re now involved in all forms of transport, even space travel. We take ideas from one sector and transfer them to others.”

Moving Platforms

PP: “This was an idea that grabbed people’s attention: a tram that can move around a city, then go to the outskirts and join a high speed rail line, without stopping, and take you to another town or even country.

PriestmanGoode Moving Platforms animation

“First and last mile is the logjam. If you can crack that then people won’t need personal transport. The cost of private car ownership is astronomical – you have to park it, maintain it, it depreciates something rotten. But carsharing isn’t working yet because the cars themselves are not designed for it – they are designed to be personal.

“There’s an opportunity to rethink the whole system from purchase through leasing to shared ownership and public for hire models, alongside designing an interior which is appropriate for these variants of use. There are a number of disruptors in the market and just as we’ve seen other markets completely transformed through disruptors such as Uber or Amazon, so there’s an opportunity to look at the car industry in the same way.

“The car industry keeps forcing the same product on us, but the market wants change. For the majority of people, especially in cities, you can’t equate private car ownership with the open road where you can do what you want. It’s just not realistic, but I understand that there are different needs for rural and urban dwellers.

“London is an example of a great public transport system, although most of our stations were designed 150 years ago and haven’t changed much. I use an app to see when the next bus is due and then walk up to the bus stop. The bus usually arrives on time and we fly down our own lane on the Euston Road, passing all the cars stuck in traffic.”

Dromos ANT

PP: “The system is important, not just the vehicle. It is elegant public transport designed around the passenger – the first autonomous system to deliver mass transit, and the infrastructure belongs to the city. The car we designed is half the width of a normal car, with space for two or three people, and it can be steam cleaned. It’s a personal vehicle which will come to you, wherever you are, and then join a dedicated track, becoming almost like a train, before peeling off to complete the journey.”

PriestmanGoode modular electric car for Dromos

At this point, Priestman refers to our interview with the arch critic of driverless cars, Christian Wolmar. PP: “The problem with some self-driving concepts is you still get traffic jams full of cars with no one in them. A lot of that congestion is caused by delivery vehicles – every time you buy something online you’re causing a traffic jam. Once you have a vehicle which has a dedicated highway you’re free from other traffic and can travel faster and closer together.”

Scooter for Life

PP: “The Scooter for Life was a special commission for the New Old exhibition at the Design Museum. We gave it three wheels, so it doesn’t fall over, and a basket for your bag or dog. It’s electric and can also be automated, so there’s a take-me-home button. People immediately think of autonomous vehicles as being car-sized, but I think they might be smaller. The only reason cars were that size in the first place was to fit in the huge engine, which you no longer need.

PriestmanGoode Scooter for Life

“People taking the tube for only a stop or two really slow things down, whereas bikes, scooters and walking mean you see more of the city. It’s a bit ‘reclaim the streets’ and reminds me of the Walklines we designed years ago. The Covid situation, terrible as it is, has shown us a less congested London – an increase in the use of bikes and walking, a city moving in a much healthier way. For me, that’s much more beautiful.”

For more on these designs, and a prototype Hyperloop passenger capsule, visit priestmangoode.com.

Thomas Sors says connectivity is the essential foundation for autonomous vehicles.

Putting the C in Connected and Automated Mobility

Our Zenzic CAM Creator series continues with Beam Connectivity CEO, Thomas Sors.

Having previously led Dyson’s Connected Vehicle programme, Thomas Sors launched Beam Connectivity in January this year. It might be one of the newest cogs in the UK automotive wheel, but its Connected Vehicle as a Service (CVaaS) product is already attracting interest from car, freight and public transport manufacturers.

TS: “When it comes to connected and automated mobility (CAM) and connected and autonomous vehicles (CAVs), we see a lot of focus on the ‘A’ part, but not so much about ‘C’, which is our focus. Connectivity is the essential foundation for automation later on, but at the moment it often doesn’t perform very well. For example, OEM apps sometimes get two point something star ratings due to problems with the initial connection and latency.

“Our CVaaS solution provides a better user experience and can unlock the value of data generated by vehicle fleets. It offers a new way of getting data from vehicles to the cloud and back-end, or to send data into the vehicle. Because we’re brand new, there are no issues with legacy software – privacy by design and security by design are embedded all the way through our process, not an afterthought or a bolt-on. That starts with ensuring that we fulfil General Data Protection Regulation (GDPR) access rights, including the right to be forgotten.
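
As a hedged illustration of privacy by design in a connected-vehicle back end (not Beam Connectivity’s implementation), honouring a GDPR erasure request might reduce to deleting every record keyed to a vehicle; the store and identifiers below are hypothetical.

```python
from typing import Dict, List

# Hypothetical in-memory telemetry store, keyed by vehicle ID.
telemetry_store: Dict[str, List[dict]] = {
    "VIN123": [{"speed_kph": 48, "ts": "2021-03-01T10:00:00Z"}],
    "VIN456": [{"speed_kph": 30, "ts": "2021-03-01T10:00:05Z"}],
}

def erase_vehicle_data(vehicle_id: str) -> int:
    """Honour a right-to-be-forgotten request by removing all telemetry for one vehicle."""
    records = telemetry_store.pop(vehicle_id, [])
    # A production system would also purge backups, caches and analytics exports.
    return len(records)

if __name__ == "__main__":
    print(f"Erased {erase_vehicle_data('VIN123')} record(s) for VIN123")
```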

“I’ve seen quotes that by 2030 all cars will have some form of connectivity. eCall [the EU initiative to enable cars to automatically contact the emergency services in the event of a serious accident] is mandatory for new cars, and that’s just the start. It’s about transparency and explaining the benefits. If you give people the option to say ‘yes, take this data in order for me to get feature X’, then that builds trust.

“From the manufacturer or fleet operator perspective, prognostics is an interesting area – fixing things before they go wrong. Then there’s the ability to understand usage patterns and perform over-the-air (OTA) updates. Another thing we’re already seeing is support to improve the driving experience, for example, vehicle-to-infrastructure communications being used to reduce congestion. We expect that to build up quickly over the next 2-4 years.
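
A toy example of the prognostics idea, with invented thresholds rather than Beam Connectivity’s logic: flag a battery for attention before it fails by watching a simple trend in its telemetry.

```python
def battery_needs_attention(voltage_readings: list, min_healthy_v: float = 12.2) -> bool:
    """Flag a 12 V battery whose resting voltage is trending below a healthy threshold."""
    if len(voltage_readings) < 3:
        return False  # not enough history to judge a trend
    recent = voltage_readings[-3:]
    trending_down = all(b < a for a, b in zip(recent, recent[1:]))
    return trending_down and recent[-1] < min_healthy_v

if __name__ == "__main__":
    print(battery_needs_attention([12.6, 12.4, 12.3, 12.1]))  # True: falling and below threshold
```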

“We’re only a few months in but we’ve already deployed an end-to-end system to several vehicles and we’re looking to do more and more. It’s not unusual for manufacturers to spend 12-18 months building a connected vehicle solution, so our platform can really speed up their development lifecycle. Why build a connectivity team when we’ve already done it very effectively?

“As to self-driving, the technology is leading the way and moving along quickly, so the focus needs to be on standards, legislation and public acceptance.”

For further info, visit beamconnectivity.com.