Shared micromobility giant Lime is finally bringing on some of its own city-appeasing advanced rider assistance system (ARAS) technology. At a Lime event in Paris, the company shared plans to pilot an in-house-built computer vision platform that uses cameras to detect when users are riding on the sidewalk. Both responses, audibly alerting riders to their transgression and actually slowing them down, are available, but it will be at each city's discretion which of them to enable.

Lime will be piloting the tech on close to 400 scooters in San Francisco and Chicago starting in early to mid-August. By the end of the year, Lime hopes to expand the pilot to six cities in total, including Paris, where the company held a demo of the new tech on Wednesday.

Because it’s easier for cities to blame micromobility companies and scooter riders for sidewalk riding than to invest the time and money needed to build protected bike lanes, almost every major operator has been implementing some form of scooter ARAS over the past year.

Bird, Superpedestrian and Neuron rely on location-based systems to determine where scooters are riding and parking. Voi, Spin and Zipp have all piloted the computer vision tech of third-party providers Drover AI and Luna. Lime said it, too, has piloted third-party computer vision systems to test the viability of the business model before investing in its own, which will be the first operator-built camera-based platform to detect sidewalk riding. That said, Lime isn’t the first company to integrate such a system into a scooter: Segway, which supplies many micromobility companies, recently announced its own AI-powered scooter.

Both scooter ARAS camps, localization and computer vision, have their champions. Localization advocates say their tech is cheaper and easier to implement today, whereas computer vision isn’t yet mature enough to be effective. It’s also expensive and adds yet another hardware element that can break or be vandalized on the street.

Joe Kraus, president at Lime, says the company is making a long-term bet by investing in computer vision, which he reckons will actually be cheaper than augmented GPS in the long run while also supporting a broader set of use cases.

“I think about where is massive money being invested in order to bring things down the cost curve and to bring capability up the innovation curve,” Kraus told TechCrunch. “The amount of investment that is going into making camera-based systems do more incredible things, like the accuracy of image classification and detection, is huge. There’s a huge amount of investment going into the open source software side, as well as on the chipset side to make AI chips on the edge dramatically cheaper and more power efficient.”

Improving localization-based scooter ARAS hinges on hyper-accurate maps that pinpoint where the sidewalks actually are. Kraus doesn’t see the same level of demand, investment and performance improvement happening for GPS signaling.

“The cost of GPS augmentation inside of a city with more accurate timing signals is not cheap,” said Kraus.

GPS also doesn’t open up as many doors for new features, the executive argued. Lime is starting with sidewalk-riding detection, but it plans to add other use cases, like parking detection. Currently, Lime and Bird are both piloting Google’s ARCore Geospatial API, which uses a rider’s smartphone camera to geolocate parked scooters and e-bikes precisely enough to determine whether they’re parked correctly.
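To make the parking-detection idea concrete, here is a minimal sketch of the kind of check that could run once a precise position has been obtained from a visual positioning system like the one described above. The accuracy cutoff, zone polygon and function names are assumptions for illustration, not Lime’s or Google’s actual implementation.

```python
# Hypothetical parking check: assumes a visual-positioning fix has already
# produced a latitude/longitude plus a horizontal accuracy estimate (meters).
# The polygon, threshold and names below are illustrative, not Lime's code.

def point_in_polygon(lat, lng, polygon):
    """Ray-casting test: is (lat, lng) inside a polygon of (lat, lng) vertices?"""
    inside = False
    j = len(polygon) - 1
    for i in range(len(polygon)):
        lat_i, lng_i = polygon[i]
        lat_j, lng_j = polygon[j]
        # Count crossings of a ray extending from the point.
        if (lng_i > lng) != (lng_j > lng):
            if lat < (lat_j - lat_i) * (lng - lng_i) / (lng_j - lng_i) + lat_i:
                inside = not inside
        j = i
    return inside

def is_parked_correctly(lat, lng, accuracy_m, parking_zone, max_accuracy_m=2.0):
    """Accept the parking spot only if the fix is tight enough and the
    scooter sits inside a designated parking zone."""
    if accuracy_m > max_accuracy_m:
        return False  # position too coarse to trust
    return point_in_polygon(lat, lng, parking_zone)

# Example: a small rectangular parking corral defined by four corners.
corral = [(37.7750, -122.4194), (37.7750, -122.4192),
          (37.7752, -122.4192), (37.7752, -122.4194)]
print(is_parked_correctly(37.7751, -122.4193, 1.2, corral))  # True
```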

Lime might also choose to use its computer vision tech for “broader localization efforts” or “non-repudiation in the case of accidents,” according to Kraus. Lime Vision, as the feature is called, can also help Lime win brownie points with cities by sharing data on things like hotspots for sidewalk riding to help inform where to actually build bike lanes, or the number of potholes on certain streets.

The first version of Lime Vision, which will be tested in the initial pilot, is a retrofittable, waterproof unit that affixes to the neck of the scooter below the handlebars and houses the camera, an AI chip and a CPU. It’s here that Lime’s computer vision model runs its computations in real time, allowing for detections in less than a second, according to Kraus. The unit is wired to the scooter’s brain so it can issue commands based on what it analyzes.
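As a rough illustration of how such an edge pipeline can be wired together, here is a minimal sketch of a detection loop in which frames are classified on-device and the result is debounced before any command goes to the scooter. The class label, confidence threshold, frame rate and controller commands are hypothetical stand-ins, not Lime Vision’s actual code.

```python
import time

# Illustrative thresholds, not Lime's real values.
SIDEWALK_CONFIDENCE_THRESHOLD = 0.8   # minimum classifier confidence
CONSECUTIVE_FRAMES_REQUIRED = 3       # debounce to avoid one-frame false positives

def detection_loop(camera, classifier, controller):
    """Grab frames, classify the riding surface on the AI chip, and tell the
    scooter's controller to alert or slow the rider when sidewalk riding persists."""
    streak = 0
    while True:
        frame = camera.read()                          # capture a frame
        label, confidence = classifier.predict(frame)  # on-device inference

        if label == "sidewalk" and confidence >= SIDEWALK_CONFIDENCE_THRESHOLD:
            streak += 1
        else:
            streak = 0

        if streak >= CONSECUTIVE_FRAMES_REQUIRED:
            # Which of these actually fire is left to each city's policy.
            controller.send_command("audible_alert")
            controller.send_command("limit_speed")
            streak = 0

        time.sleep(0.1)  # ~10 Hz loop keeps end-to-end detection under a second
```

In this sketch, a detection has to persist across several consecutive frames before any command is sent, so that briefly crossing a driveway or glancing over pavement doesn’t trigger an alert.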

Lime is already working on the second version of its computer vision system that would be fully integrated into the body of the vehicle, rather than a separate attachment, said Kraus.

In addition to its computer vision launch, Lime said on Wednesday it plans to pilot, in 30 cities globally, a new reaction test meant to deter drunk riding. The test, which pops up in the Lime app when a rider tries to book a ride after a designated time of night, appears as a sort of game: the user controls a scooter as it rides along a street and is asked to hit the stop sign when it appears. Riders who don’t hit the stop sign within a certain amount of time won’t be able to start a ride.
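Below is a minimal sketch of the gating logic such a test implies: show the stop sign, time the tap, and block the ride if the rider is too slow. The time limit and callback name are assumptions for illustration; Lime hasn’t published the actual thresholds.

```python
import time

REACTION_LIMIT_SECONDS = 1.5  # assumed cutoff; the real value isn't public

def run_reaction_test(wait_for_stop_sign_tap):
    """Time the rider's response to the stop sign and decide whether the ride
    may start. `wait_for_stop_sign_tap` is a hypothetical callback that blocks
    until the rider taps or the game times out, returning True on a tap."""
    shown_at = time.monotonic()
    tapped = wait_for_stop_sign_tap(timeout=REACTION_LIMIT_SECONDS)
    reaction = time.monotonic() - shown_at

    if tapped and reaction <= REACTION_LIMIT_SECONDS:
        return True   # reaction fast enough: unlock the ride
    return False      # too slow or no tap: block the ride for now
```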

Previously, Lime had tried out a similar feature that activated after 10 p.m. in most markets: it asked riders to type “Y-E-S” in response to the question, “Do you affirm you are not drunk and fit to ride?” Last year, Bird launched Safe Start, an in-app checkpoint that asked riders to enter a keyword, a step the company hoped would deter drunk people from riding.